2 results for Bayesian Normal Mixture Model, Data Binning, Data Analysis in Glasgow Theses Service
Abstract:
This research explores the business model (BM) evolution process of entrepreneurial companies and investigates the relationship between BM evolution and firm performance. Recently, it has been increasingly recognised that the innovative design (and re-design) of BMs is crucial to the performance of entrepreneurial firms, as BMs can be associated with superior value creation and competitive advantage. However, there has been limited theoretical and empirical evidence on the micro-mechanisms behind the BM evolution process and the entrepreneurial outcomes of BM evolution. This research seeks to fill this gap by opening up the ‘black box’ of the BM evolution process, exploring the micro-patterns that facilitate the continuous shaping, changing, and renewing of BMs and examining how BM evolution creates and captures value in a dynamic manner. Drawing together the BM and strategic entrepreneurship literature, this research seeks to understand: (1) how and why companies introduce BM innovations and imitations; (2) how BM innovations and imitations interplay as patterns in the BM evolution process; and (3) how BM evolution patterns affect firm performance. This research adopts a longitudinal multiple case study design that focuses on the emerging phenomenon of BM evolution. Twelve entrepreneurial firms in the Chinese Online Group Buying (OGB) industry were selected for their continuous and intensive development of BMs and their varying success rates in this highly competitive market. Two rounds of data collection were carried out between 2013 and 2014, generating 31 interviews with founders/co-founders and a total of 5,034 pages of data. Following a three-stage research framework, the data analysis begins by mapping the BM evolution process of the twelve companies and classifying the changes in the BMs into innovations and imitations. The second stage focuses on the BM level, addressing BM evolution as a dynamic process by exploring how BM innovations and imitations unfold and interplay over time. The final stage focuses on the firm level, providing theoretical explanations for the effects of BM evolution patterns on firm performance. This research provides new insights into the nature of BM evolution by elaborating on the missing link between BM dynamics and firm performance. The findings identify four patterns of BM evolution that have different effects on a firm’s short- and long-term performance. This research contributes to the BM literature by presenting what the BM evolution process actually looks like. Moreover, it takes a step towards a process theory of the interplay between BM innovations and imitations, which addresses the role of companies’ actions and, more importantly, their reactions to competitors. Insights are also given into how entrepreneurial companies achieve and sustain value creation and capture by successfully combining the BM evolution patterns. Finally, the findings on BM evolution contribute to the strategic entrepreneurship literature by increasing the understanding of how companies compete in a more dynamic and complex environment. They reveal that the achievement of superior firm performance is not simply a question of whether to innovate or imitate, but rather one of integrating innovation and imitation strategies over time. This study concludes with a discussion of the findings and their implications for theory and practice.
Abstract:
The current approach to data analysis for the Laser Interferometer Space Antenna (LISA) depends on the time delay interferometry (TDI) observables, which have to be generated before any weak-signal detection can be performed. These are linear combinations of the raw data with appropriate time shifts that lead to the cancellation of the laser frequency noises. This is possible because of the multiple occurrences of the same noises in the different raw data. Originally, these observables were generated manually, starting with LISA as a simple stationary array and then adjusted to incorporate the antenna's motions. However, none of the observables survived the flexing of the arms, in that they did not lead to cancellation with the same structure. The principal component approach, presented by Romano and Woan, is another way of handling these noises; it simplifies the data analysis by removing the need to create the observables before the analysis. This method also depends on the multiple occurrences of the same noises but, instead of using them for cancellation, it takes advantage of the correlations that they produce between the different readings. These correlations can be expressed in a noise (data) covariance matrix, which appears in the Bayesian likelihood function when the noises are assumed to be Gaussian. Romano and Woan showed that performing an eigendecomposition of this matrix produces two distinct sets of eigenvalues that can be distinguished by the absence of laser frequency noise from one set. Transforming the raw data using the corresponding eigenvectors also produces data that are free from the laser frequency noises. This result led to the idea that the principal components may actually be time delay interferometry observables, since they produce the same outcome, that is, data that are free from laser frequency noise. The aims here were (i) to investigate the connection between the principal components and these observables, (ii) to prove that data analysis using them is equivalent to that using the traditional observables, and (iii) to determine how this method adapts to the real LISA, especially the flexing of the antenna. To test the connection between the principal components and the TDI observables, a 10×10 covariance matrix containing integer values was used in order to obtain an algebraic solution for the eigendecomposition. The matrix was generated using fixed unequal arm lengths and stationary noises with equal variances for each noise type. The results confirm that all four Sagnac observables can be generated from the eigenvectors of the principal components. The observables obtained from this method, however, are tied to the length of the data and are not general expressions like the traditional observables; for example, the Sagnac observables for two different time stamps were generated from different sets of eigenvectors. It was also possible to generate the frequency-domain optimal AET observables from the principal components obtained from the power spectral density matrix. These results indicate that this method is another way of producing the observables; therefore, analysis using principal components should give the same results as that using the traditional observables. This was confirmed by the fact that the same relative likelihoods (within 0.3%) were obtained from the Bayesian estimates of the signal amplitude of a simple sinusoidal gravitational wave using the principal components and the optimal AET observables.
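To make the eigendecomposition idea above concrete, the following Python sketch (illustrative only, not the thesis code) builds a toy covariance matrix for three synthetic readings that share one large common laser-frequency noise plus small independent photodetector noises; the eigenvectors belonging to the small eigenvalues give linear combinations of the data that cancel the common laser noise, in the spirit of the TDI observables. The channel count, noise magnitudes, and eigenvalue threshold are assumptions chosen purely for illustration.

    import numpy as np

    rng = np.random.default_rng(0)
    n_samples = 10_000

    # One large laser-frequency noise common to all readings, plus small
    # independent photodetector noises (illustrative magnitudes).
    laser = 1e3 * rng.standard_normal(n_samples)
    photo = rng.standard_normal((3, n_samples))
    data = laser + photo                      # three strongly correlated readings

    # Noise (data) covariance matrix that would enter the Gaussian likelihood.
    cov = np.cov(data)

    # Eigendecomposition: eigenvalues split into a laser-noise-dominated set
    # (very large) and a set free of it (of order the photodetector variance).
    eigvals, eigvecs = np.linalg.eigh(cov)
    print("eigenvalues:", eigvals)

    # Project the raw data onto the eigenvectors of the small eigenvalues:
    # these combinations cancel the common laser noise.
    laser_free = eigvecs[:, eigvals < 10.0].T @ data
    print("residual std of laser-free combinations:", laser_free.std(axis=1))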
This method fails if the eigenvalues that are free from laser frequency noise are not generated. These are obtained from the covariance matrix, and the properties of LISA required for its computation are the phase-locking, arm lengths, and noise variances. Preliminary results on the effects of these properties on the principal components indicate that only the absence of phase-locking prevented their production. The flexing of the antenna results in time-varying arm lengths, which will appear in the covariance matrix; from our toy-model investigations, this did not prevent the occurrence of the principal components. The difficulty with flexing, and also with non-stationary noises, is that the Toeplitz structure of the matrix will be destroyed, which will affect any computation methods that take advantage of this structure. Separating the two sets of data for the analysis was not necessary, because the laser frequency noises are very large compared with the photodetector noises, which resulted in a significant reduction of the data containing them after the matrix inversion. In the frequency domain the power spectral density matrices were block diagonal, which simplified the computation of the eigenvalues by allowing it to be done separately for each block. In general, the results showed a lack of principal components in the absence of phase-locking, except for the zero bin. The major difference with the power spectral density matrix is that the time-varying arm lengths and the non-stationarity do not show up, because of the summation in the Fourier transform.
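The block-diagonal simplification mentioned above can be sketched as follows (a minimal illustration, assuming placeholder Hermitian blocks rather than actual LISA power spectra): each frequency bin contributes one small block of the power spectral density matrix, so the eigendecomposition can be carried out bin by bin instead of on one large matrix.

    import numpy as np

    rng = np.random.default_rng(1)
    n_bins, n_channels = 4, 3

    def random_hermitian(n):
        # Placeholder Hermitian positive semi-definite block for one frequency bin.
        a = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
        return a @ a.conj().T

    psd_blocks = [random_hermitian(n_channels) for _ in range(n_bins)]

    # Because the full matrix is block diagonal, the eigendecomposition factorises
    # into independent, cheap decompositions of each per-bin block.
    for k, block in enumerate(psd_blocks):
        eigvals, _ = np.linalg.eigh(block)
        print(f"bin {k}: eigenvalues {np.round(eigvals, 2)}")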