32 results for Perron’s eigenvector
at Queensland University of Technology - ePrints Archive
Abstract:
A small group of companies including Intel, Microsoft, and Cisco have used "platform leadership" with great effect as a means for driving innovation and accelerating market growth within their respective industries. Prior research in this area emphasizes that trust plays a critical role in the success of this strategy. However, many of the categorizations of trust discussed in the literature tend to ignore or undervalue the fact that trust and power are often functionally equivalent, and that the coercion of weaker partners is sometimes misdiagnosed as collaboration. In this paper, I use case study data focusing on Intel's shift from ceramic/wire-bonded packaging to organic/C4 packaging to characterize the relationships between Intel and its suppliers, and to determine if these links are based on power in addition to trust. The case study shows that Intel's platform leadership strategy is built on a balance of both trust and a relatively benevolent form of power that is exemplified by the company's "open kimono" principle, through which Intel insists that suppliers share detailed financial data and highly proprietary technical information to achieve mutually advantageous objectives. By explaining more completely the nature of these inter-firm linkages, this paper usefully extends our understanding of how platform leadership is maintained by Intel, and contributes to the literature by showing how trust and power can be used simultaneously within an inter-firm relationship in a way that benefits all of the stakeholders.
Abstract:
Social capital plays an important role in explaining how value is created from firms' network relationships, but little is understood about how social capital is shaped over time and how it is re-shaped when firms consolidate their network ties. In response, this study explores the evolution of social capital in buyer–supplier relationships through a case study of a company undertaking radical product innovation, and examines the corresponding changes in the firm's network of buyer–supplier relationships. The analysis shows that social capital is built in a decidedly non-linear and non-uniform manner. The study also reveals considerable interaction among the dimensions of social capital throughout the evolution of the firm's network, and emphasizes the importance of the cognitive dimension—a feature that has received little attention thus far. The evidence also shows that, when network ties are sacrificed, efforts to strengthen social capital need to increase in order to prevent unintended consequences for firms' longer-term value creation.
Abstract:
Prior evidence from the fields of innovation management and supplier relations predicts that Japanese firms should be naturally disadvantaged in developing and deploying radical innovations. But this conclusion is inconsistent with recent developments in the automotive industry. This paper presents secondary case study data focusing on fuel cell powered vehicles and hybrid cars to show that Toyota, one of Japan's largest and most influential corporations, is capable of developing radically new technologies, and is in several respects better at this sort of innovation than the rest of the global automotive industry.
Abstract:
The concept of ‘strategic dalliances’ – defined as non-committal relationships that companies can ‘dip in and out of,’ or dally with, while simultaneously maintaining longer-term strategic partnerships with other firms and suppliers – has emerged as a promising strategy by which organizations can create discontinuous innovations. But does this approach work equally well in every sector? Moreover, how can these links be used effectively to foster the process of discontinuous innovation? To assess the role that industry clockspeed plays in the success or failure of strategic dalliances, we provide case study evidence from Twister BV, an upstream oil and gas technology provider, and show that strategic dalliances can be an enabler for the discontinuous innovation process in slow clockspeed industries. Implications for research and practice are discussed, and conclusions are drawn from our findings.
Abstract:
Overall, computer models and simulations have a rather disappointing record within the management sciences as a tool for predicting the future. Social and market environments can be influenced by an overwhelming number of variables, and it is therefore difficult to use computer models to make forecasts or to test hypotheses concerning the relationship between individual behaviours and macroscopic outcomes. At the same time, however, advocates of computer models argue that they can be used to overcome the human mind's inability to cope with several complex variables simultaneously or to understand concepts that are highly counterintuitive. This paper seeks to bridge the gap between these two perspectives by suggesting that management research can indeed benefit from computer models by using them to formulate fruitful hypotheses.
Abstract:
Purpose of this paper – To help establish whether strong relationships between suppliers and customers improve performance, and whether prescriptive frameworks on outsourcing radical innovations depend on industry clockspeed.
Design/methodology/approach – A survey of UK-based manufacturers, followed by a statistical analysis.
Findings – Long-term supplier links appear to play no role in the development of radical innovations. Moreover, industry clockspeed has no significant bearing on the success or failure of any outsourcing strategy for radically new technologies.
Research limitations/implications – The literature on outsourcing in the face of radical innovation can be applied with greater confidence to industries of all clockspeeds.
Practical implications – Prescriptions for fast clockspeed industries should be applied more broadly: all industries should maintain a high degree of vertical integration in the early days of a radical innovation.
Originality/value – Prior papers had explored whether a company should outsource radical innovations, but none had determined whether this holds equally for slow industries and fast ones. Therein lies the original contribution of this paper.
Abstract:
Some evidence in the area of make-buy decisions for new technologies suggests that it is a good idea for a company to pursue a fairly rigorous “make” policy in the early days of a potentially disruptive innovation. Other studies prescribe exactly the opposite, promoting instead a “buy” strategy. This paper seeks to bridge the gap between these perspectives by suggesting that both strategies are valid, but that they are most successfully applied in different market environments. The “make” prescription may be more suited to either extremely fast or extremely slow rates of technological change, while a “buy” strategy might be more appropriate in market sectors where technologies evolve at a medium pace. This paper highlights the importance of industry clockspeed and supplier relationships in make-buy decisions for new technologies, and puts forward two new hypotheses that require empirical testing.
Abstract:
Purpose – To determine whether clockspeed is an important variable in outsourcing strategies throughout the development of radical innovations.
Design/methodology/approach – An internet-based survey of manufacturing firms from around the world.
Findings – An industry's clockspeed does not play a significant role in the success or failure of a particular outsourcing strategy for a radical innovation.
Research limitations/implications – Conclusions from earlier research in this area are not necessarily industry-specific.
Practical implications – Lessons learned from previous investigations of the computer industry need not be confined to that sector. Vertical integration may be a more robust outsourcing strategy when developing a radical innovation in industries of all clockspeeds.
Originality/value – Previous research efforts in this field focused on a single technology jump, but that approach may have overlooked a potentially important variable: industry clockspeed. This investigation accordingly explores whether clockspeed is an important factor.
Abstract:
Significant empirical data from the fields of management and business strategy suggest that it is a good idea for a company to make in-house the components and processes underpinning a new technology. Other evidence suggests exactly the opposite, saying that firms would be better off buying components and processes from outside suppliers. One possible explanation for this lack of convergence is that earlier research in this area has overlooked two important aspects of the problem: reputation and trust. To gain insight into how these variables may impact make-buy decisions throughout the innovation process, the Sporas algorithm for measuring reputation was added to an existing agent-based model of how firms interact with each other throughout the development of new technologies. The model's results suggest that reputation and trust do not play a significant role in the long-term fortunes of an individual firm as it contends with technological change in the marketplace. Accordingly, this model serves as a cue for management researchers to investigate more thoroughly the temporal limitations and contingencies that determine how the trust between firms may affect the R&D process.
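For context on the reputation mechanism mentioned above, the sketch below illustrates a Sporas-style update rule of the kind referenced in this abstract, written in Python. The parameter values, function names, and the surrounding loop are illustrative assumptions only; they are not taken from the paper's agent-based model.

import math

# Illustrative Sporas-style reputation update (after Zacharia & Maes).
# All parameter values below are assumptions for demonstration purposes,
# not the settings used in the paper's simulation.
D = 3000.0      # maximum attainable reputation (assumed)
THETA = 10.0    # effective number of ratings considered (assumed)
SIGMA = 300.0   # slope of the damping function (assumed)

def damping(r):
    # Damping factor that slows reputation growth as r approaches D.
    return 1.0 - 1.0 / (1.0 + math.exp(-(r - D) / SIGMA))

def sporas_update(r_prev, rater_reputation, rating):
    # r_prev           -- firm's reputation before the interaction
    # rater_reputation -- reputation of the partner giving the rating (0..D)
    # rating           -- rating received for this interaction (0..1)
    expected = r_prev / D  # rating expected given the current reputation
    return r_prev + (1.0 / THETA) * damping(r_prev) * rater_reputation * (rating - expected)

# Example: a low-reputation firm repeatedly rated 0.9 by a well-regarded partner.
r = 300.0
for _ in range(5):
    r = sporas_update(r, rater_reputation=2500.0, rating=0.9)
    print(round(r, 1))

In an agent-based setting of the kind described above, each firm would carry such a reputation score and update it after every interaction; the damping term keeps well-established reputations from swinging wildly on a single rating.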
Abstract:
The technology and innovation management literature offers somewhat conflicting evidence with regards to the formation of spinoff companies for radically new technologies. Sometimes spinoffs seem to be a very effective strategy—but not always. An obvious question emerges: under what conditions is a spinoff the best way to pursue a radical technology? This paper sheds light on this question by presenting case study evidence from spinoff firms within the Shell Technology Ventures portfolio. The data point to industry clockspeed as a potentially important variable in the decision to create a spinoff or not.
Abstract:
Despite the compelling case for moving towards cloud computing, the upstream oil & gas industry faces several technical challenges—most notably, a pronounced emphasis on data security, a reliance on extremely large data sets, and significant legacy investments in information technology (IT) infrastructure—that make a full migration to the public cloud difficult at present. Private and hybrid cloud solutions have consequently emerged within the industry to yield as much benefit from cloud-based technologies as possible while working within these constraints. This paper argues, however, that the move to private and hybrid clouds will very likely prove only to be a temporary stepping stone in the industry’s technological evolution. By presenting evidence from other market sectors that have faced similar challenges in their journey to the cloud, we propose that enabling technologies and conditions will probably fall into place in a way that makes the public cloud a far more attractive option for the upstream oil & gas industry in the years ahead. The paper concludes with a discussion about the implications of this projected shift towards the public cloud, and calls for more of the industry’s services to be offered through cloud-based “apps.”
Abstract:
In light of the high stakes of the Deepwater Horizon civil trial and the important precedent-setting role that the case will have on the assessment of future marine disasters, the methodologies underpinning the calculations of damage on both sides will be subjected to considerable scrutiny. Despite the importance of the case, however, there seems to be a pronounced lack of convergence on it within the academic literature. Contributions from scientific journals frequently make comparisons to the Ixtoc I oil spill off the coast of Mexico in 1979; the legal literature, by stark contrast, seems to be much more focused on the Exxon Valdez spill that occurred off the shores of Alaska in 1989. This paper accordingly calls for a more thorough consideration of other analogs beyond the Exxon Valdez spill—most notably, the Ixtoc I incident—in arriving at an assessment of the damage caused by the Deepwater Horizon disaster.