982 results for Computer industry


Relevance:

100.00%

Publisher:

Abstract:

This thesis presents a study of the sources of new product ideas and the development of new product proposals in an organisation in the UK computer industry. The thesis extends the work of von Hippel by showing how the phenomenon he describes as the "Customer Active Paradigm" for new product idea generation can be observed to operate in this industry, and it contrasts this Customer Active Paradigm with the more usually encountered Manufacturer Active Paradigm. In a second area, the thesis draws a number of conclusions relating to methods of market research, confirming existing observations and demonstrating the suitability of flexible interview strategies in certain circumstances. The thesis goes on to demonstrate the importance of free information flow within the organisation, which makes it more likely that both sought and unsought opportunities can be exploited. It is shown that formal information flows and documents are a necessary but not sufficient means of influencing the formation of the organisation's dominant ideas on new product areas. The findings also link the work of Tushman and Katz on the role of "gatekeepers" with the work of von Hippel by showing that the gatekeeper role is particularly appropriate and useful to an organisation changing from Customer Active to Manufacturer Active methods of idea generation. Finally, the thesis provides conclusions relating to the exploitation of specific new product opportunities facing the sponsoring organisation.

Relevance:

70.00%

Publisher:

Abstract:

Purpose – To determine whether clockspeed is an important variable in outsourcing strategies during the development of radical innovations. Design/methodology/approach – An internet-based survey of manufacturing firms from around the world. Findings – An industry's clockspeed does not play a significant role in the success or failure of a particular outsourcing strategy for a radical innovation. Research limitations/implications – Conclusions from earlier research in this area are not necessarily industry-specific. Practical implications – Lessons learned through previous investigations of the computer industry need not be confined to that sector; vertical integration may be a more robust outsourcing strategy when developing a radical innovation in industries of all clockspeeds. Originality/value – Previous research efforts in this field focused on a single technology jump, an approach that may have overlooked a potentially important variable: industry clockspeed. This investigation therefore explores whether clockspeed is in fact such a variable.

Relevance:

70.00%

Publisher:

Abstract:

Industry clockspeed has been used in earlier literature to assess the rate of change of industries, but this measure remains limited in its application to longitudinal analyses as well as to systemic industry contexts. Nevertheless, there is a growing need for such a measure as business ecosystems replace standalone products and organisations are required to manage their innovation processes in increasingly systemic contexts. In this paper, we first derive a temporal measure of technological industry clockspeed, which evaluates the time between successively higher levels of performance in the industry's product technology. We then derive a systemic technological industry clockspeed for systemic industry contexts, which measures the time required for a particular sub-industry to utilise the level of technological performance provisioned by another, interdependent sub-industry. We illustrate the use of these measures in an empirical study of the systemic personal computer industry. The results of our empirical illustration show that the proposed clockspeeds together provide informative measures of the pace of change for sub-industries and for the systemic industry as a whole. We conclude by discussing the organisational considerations and theoretical implications of the proposed measures.
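
As a reading aid, the temporal clockspeed described above can be interpreted as the average interval between successively higher (record-setting) performance levels in a technology time series. The following is a minimal sketch under that interpretation only; the function name and the sample data are illustrative assumptions, not taken from the paper.

```python
from datetime import date

def temporal_clockspeed(observations):
    """Average time (in days) between successively higher performance
    levels in a product technology time series.

    observations: iterable of (date, performance) pairs, any order.
    Returns None if fewer than two record-setting levels exist.
    """
    # Sort chronologically, then keep only observations that raise
    # the running performance maximum (successively higher levels).
    ordered = sorted(observations, key=lambda obs: obs[0])
    record_dates = []
    best = float("-inf")
    for when, performance in ordered:
        if performance > best:
            best = performance
            record_dates.append(when)
    if len(record_dates) < 2:
        return None
    gaps = [(b - a).days for a, b in zip(record_dates, record_dates[1:])]
    return sum(gaps) / len(gaps)

# Illustrative data: shorter gaps between records => faster clockspeed.
cpu_perf = [
    (date(2000, 1, 1), 1.0),
    (date(2001, 6, 1), 1.8),
    (date(2002, 3, 1), 2.9),
    (date(2002, 3, 1), 2.5),  # non-record observation is ignored
]
print(temporal_clockspeed(cpu_perf))  # mean days between records
```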

Relevance:

60.00%

Publisher:

Abstract:

Purpose: In this work, tension, impact, bend and fatigue tests were conducted on an AM60 magnesium alloy. The effects of environmental temperature and loading rate on the impact and tension behavior of the alloy were also investigated. Design/methodology/approach: The tests were conducted using an Instron universal testing machine. The loading speed was varied from 1 mm/min to 300 mm/min to gain a better understanding of the effect of strain rate. To understand the failure behavior of the alloy at different environmental temperatures, Charpy impact tests were conducted over a range of temperatures (−40 to 35°C). Plane strain fracture toughness (KIC) was evaluated using compact tension (CT) specimens. To gain a better understanding of the failure mechanisms, all fracture surfaces were observed using scanning electron microscopy (SEM). In addition, the fatigue behavior of the alloy was evaluated under tension-tension loading at 30 Hz, with the stress amplitude selected in the range of 20-50 MPa to obtain the S-N curve. Findings: The tensile tests indicated that the mechanical properties were not sensitive to the strain rates applied (3.3×10^-4 to 0.1 s^-1) and that plastic deformation was dominated by twinning-mediated slip. The impact energy was not sensitive to the environmental temperature. The plane strain fracture toughness and fatigue limit were evaluated, with average values of 7.6 MPa·m^1/2 and 25 MPa, respectively. Practical implications: The tested AM60 Mg alloy can be applied in, among others, the automotive, aerospace, communication and computer industries. Originality/value: Many investigations have been conducted to develop new Mg alloys with improved stiffness and ductility, but relatively little attention has been paid to the failure mechanisms of Mg alloys, such as brittle fracture and fatigue, under different environmental or loading conditions. In this work, tension, impact, bend and fatigue tests were conducted on an AM60 magnesium alloy.
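
For readers unfamiliar with how a KIC value such as the 7.6 MPa·m^1/2 reported above is obtained from a compact tension test, the sketch below applies the standard ASTM E399 geometry function for a CT specimen. The load and specimen dimensions are illustrative assumptions, not the paper's data.

```python
import math

def ct_stress_intensity(P, B, W, a):
    """Stress intensity factor K for a compact tension (CT) specimen,
    using the standard ASTM E399 geometry function f(a/W).

    P: load [N]; B: thickness [m]; W: width [m]; a: crack length [m].
    Returns K in MPa*m^0.5. Valid roughly for 0.2 <= a/W <= 1.
    """
    alpha = a / W
    f = ((2 + alpha) / (1 - alpha) ** 1.5) * (
        0.886 + 4.64 * alpha - 13.32 * alpha**2
        + 14.72 * alpha**3 - 5.6 * alpha**4
    )
    return P / (B * math.sqrt(W)) * f / 1e6  # Pa*sqrt(m) -> MPa*sqrt(m)

# Illustrative numbers only (not the paper's specimen): a 10 mm thick,
# 50 mm wide CT specimen with a 25 mm crack failing at 1.2 kN.
print(round(ct_stress_intensity(1200, 0.010, 0.050, 0.025), 2), "MPa*m^0.5")
```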

Relevance:

60.00%

Publisher:

Abstract:

Many systemic, complex technologies have been suggested to exhibit increasing returns to adoption, whereby an initial increase in adoption leads to increasing experience with the technology, which drives technological improvement and use, in turn leading to further adoption. In addition, in the systemic context, mimetic behavior may reinforce increasing returns as technology adoption is witnessed among other agents. Finally, interdependencies in the systemic context also sensitize adoption behavior to fundamental changes in technology provisioning, which may likewise support increasing-returns dynamics in adoption. Our empirical study examines the dynamics of organizational technology adoption when technology is provisioned by organizations in another sub-system in a systemic context. We hypothesize that innovation, imitation, and technological change effects are present in creating increasing returns in the systemic context. Our empirical setting considers 24 technologies, represented by 2282 data points, in the computer industry. Our results support our prediction that imitation effects are present in creating increasing returns to adoption. We further discuss the managerial and research implications of our results.
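
The innovation and imitation effects hypothesized here correspond to the external and internal influence coefficients of classic diffusion models in the Bass tradition. The sketch below shows only that baseline mechanism (the paper's systemic and technological-change extensions are not reproduced), with illustrative parameter values.

```python
def bass_adoption(m, p, q, periods):
    """Classic Bass diffusion: cumulative adopters N(t) out of a
    market of size m, with innovation coefficient p (external
    influence) and imitation coefficient q (internal influence,
    the increasing-returns channel).
    """
    N = 0.0
    path = []
    for _ in range(periods):
        # New adopters this period: innovators plus imitators, where
        # imitation scales with the installed base N/m.
        new = (p + q * N / m) * (m - N)
        N += new
        path.append(N)
    return path

# Illustrative values: imitation (q) dominating innovation (p)
# produces the familiar S-curve of increasing returns to adoption.
for t, N in enumerate(bass_adoption(m=1000, p=0.01, q=0.4, periods=10), 1):
    print(t, round(N))
```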

Relevance:

60.00%

Publisher:

Abstract:

In his 1987 book, The Media Lab: Inventing the Future at MIT, Stewart Brand provides an insight into the visions of the future of the media in the 1970s and 1980s. He notes that Nicholas Negroponte made a compelling case for the foundation of a media laboratory at MIT with diagrams detailing the convergence of three sectors of the media: the broadcast and motion picture industry; the print and publishing industry; and the computer industry. Stewart Brand commented: 'If Negroponte was right and communications technologies really are converging, you would look for signs that technological homogenisation was dissolving old boundaries out of existence, and you would expect an explosion of new media where those boundaries used to be'. Two decades later, technology developers, media analysts and lawyers have become excited about the latest phase of media convergence. In 2006, the faddish Time Magazine heralded the arrival of various Web 2.0 social networking services: 'You can learn more about how Americans live just by looking at the backgrounds of YouTube videos—those rumpled bedrooms and toy-strewn basement rec rooms—than you could from 1,000 hours of network television. And we didn't just watch, we also worked. Like crazy. We made Facebook profiles and Second Life avatars and reviewed books at Amazon and recorded podcasts. We blogged about our candidates losing and wrote songs about getting dumped. We camcordered bombing runs and built open-source software. America loves its solitary geniuses—its Einsteins, its Edisons, its Jobses—but those lonely dreamers may have to learn to play with others. Car companies are running open design contests. Reuters is carrying blog postings alongside its regular news feed. Microsoft is working overtime to fend off user-created Linux. We're looking at an explosion of productivity and innovation, and it's just getting started, as millions of minds that would otherwise have drowned in obscurity get backhauled into the global intellectual economy.' The magazine announced that Time's Person of the Year was 'You', the everyman and everywoman consumer, 'for seizing the reins of the global media, for founding and framing the new digital democracy, for working for nothing and beating the pros at their own game'. This review essay considers three recent books which have explored the legal dimensions of new media. In contrast to the unbridled exuberance of Time Magazine, this series of legal works displays an anxious trepidation about the legal ramifications associated with the rise of social networking services. In his tour de force, The Future of Reputation: Gossip, Rumor, and Privacy on the Internet, Daniel Solove considers the implications of social networking services, such as Facebook and YouTube, for the legal protection of reputation under privacy law and defamation law. Andrew Kenyon's edited collection, TV Futures: Digital Television Policy in Australia, explores the intersection between media law and copyright law in the regulation of digital television and Internet videos. In The Future of the Internet and How to Stop It, Jonathan Zittrain explores the impact of 'generative' technologies and 'tethered applications', considering everything from the Apple Mac and the iPhone to the One Laptop per Child programme.

Relevance:

60.00%

Publisher:

Abstract:

This paper looks at intervention programmes to improve the representation of female students in computing education and the computer industry. A multiple case study methodology was used to examine major intervention programmes conducted in Australia. One aspect of the research focused on the programme champions: the women from the computing industry, government organisations and academia who instigated the programmes. The success of these intervention programmes appears to have been highly dependent not only on the design of the programme but also on the involvement of these strong individuals, who were passionate and worked tirelessly to ensure the programme's success. This paper provides an opportunity for the voices of these women to be heard. It describes the champions' own initial involvement with computing, which frequently motivated and inspired them to conduct such programmes. The research found that when these types of intervention programmes were conducted by academic staff, the work was undervalued compared to when the activities were conducted by staff in industry or in government. The academic environment was often not supportive of academics who conducted intervention programmes for female students.

Relevance:

60.00%

Publisher:

Abstract:

Two issues were addressed. (1) Women are underrepresented in computing courses and in the computing workplace; despite almost two decades of recognition of the issue and of intervention to correct it, the proportion of women in computing continues to decline. (2) There is a shortage of people with appropriate skills and qualifications in computing and, more specifically, a need for people with particular personality attributes: there is an increasing demand for computing personnel with good communication and interpersonal skills, but the predominant personality types of computing people do not include these characteristics. The research relating to the underrepresentation of women was conducted as a series of interviews with university students, female computing professionals and secondary school girls. The main findings of these studies were: 1) schoolgirls are interested in careers that are interesting and varied and provide opportunities for interaction with others; 2) schoolgirls perceive computing as involving working alone; 3) women working in computing describe careers that are interesting, varied and people-oriented; 4) tertiary computing students equated 'computing' with 'programming'; and 5) single interventions are unlikely to result in individuals in the targeted group deciding to study computing. The perception of schoolgirls that computing involves working alone, which is reinforced by many tertiary computing courses, suggested that the type of person likely to be attracted to computing is one who would prefer to work alone, and it was predicted that schoolboys would have similar perceptions of computing. Thus, computing is likely to attract students who would prefer to work alone; for various social and stereotypical reasons addressed by previous research, these students will be predominantly male. In the final study, preferred Myers-Briggs Type Indicator and Strong Interest Inventory personality types were suggested for computer programmers, systems designers and systems analysts. The existing literature and the 'types' of 72 study participants tended to confirm that 1) certain personality types are overrepresented in computing; 2) these types are well suited to programming and design tasks; and 3) there is an underrepresentation of individuals who have the combination of analytical, communication and people skills required particularly of analysts but also of many others working in computing today. Interviews with participants supported the earlier findings that computing careers are perceived by students to be technical and to involve working in isolation, but for many computing people this is not the reality.

Relevance:

60.00%

Publisher:

Abstract:

As with the litigation involving its predecessor Napster, the cases involving the Australian-based P2P service Kazaa and its US licensees Grokster and Morpheus required the courts to balance the legitimate interests of the computer industry and the public in new and advanced technologies, on the one hand, against those of the so-called "content providers" of the media and entertainment industry on the other. The article examines how US and Australian courts approached this task and how, despite differences in the legal frameworks of the two countries, they reached similar conclusions.

Relevance:

60.00%

Publisher:

Abstract:

For more than 30 years Brazil has developed specific policies for the informatics sector, from the National Informatics Policy of the 1970s, through the Market Reserve Period of the 1980s, to the present day, in which Information and Communication Technologies (ICT) are regarded as one of the priority areas of the country's Industrial Policy. Among the current goals, the focus on expanding the volume of software and services exports stands out. Despite these ambitions, however, the country has not achieved significant international prominence in the sector. India, on the other hand, also considered an emerging country and a member of the BRIC group, exported around US$47 billion in software and Information Technology (IT) services in 2009, standing out as a protagonist in the sector's international market. The establishment of a technically sophisticated industry such as software, which demands an environment conducive to innovation, in a developing country like India draws attention. Certainly, legal-institutional arrangements were employed in that country. Which ones? To what extent did such arrangements help India's development of the sector? And what about Brazil? This work starts from the hypothesis that the legal-institutional environment of these countries defined distinct knowledge flows, influencing the type of development of each country's software sector. The specific objective of this research is to investigate how, among other socio-economic factors, these legal-institutional arrangements influenced the differing configuration of knowledge flows. Here, the legal-institutional environment is understood as all the regulations that establish institutions, guidelines and common conditions for a given subject. Assuming that the software sector carries out knowledge-intensive activities, for each country in question only those legal-institutional arrangements that had, or have, the power to delimit the sector's knowledge flows will be analysed, whether they derive from trade policies (export and import, or intellectual property) or from investment policies for innovation. The fundamental question goes beyond the debate over whether or not the State should intervene, focusing instead on the analysis of the different types of involvement observed and their effects. To this end, in addition to a literature review, field research was carried out in India (Delhi, Mumbai, Bangalore) and Brazil (São Paulo, Brasília and Rio de Janeiro), where interviews were conducted with software companies and associations, public managers and academics who study the sector.

Relevance:

60.00%

Publisher:

Abstract:

The rapid development of the computer industry through the continual shrinking of transistors is fast approaching the limit of silicon technology, beyond which tunneling processes in the transistors no longer permit their further miniaturization or higher packing density in processors. The future of computer technology lies in the processing of quantum information. For the development of quantum computers, the detection and targeted manipulation of individual spins in solids is of the greatest importance. Standard methods of spin detection, such as ESR, however, only allow the detection of spin ensembles. The idea that should enable the readout of single spins is to carry out the manipulation separately from the detection.

The NV⁻ center is a special lattice defect in diamond that can be used as an atomic, optically readable magnetic field sensor. By measuring its fluorescence, it should be possible to detect the manipulation of other, optically undetectable "dark spins" in the immediate vicinity of the NV center via spin-spin coupling. The proposed model of the quantum computer is based on N@C60 enclosed in SWCNTs. The peapods, as these units of nitrogen-bearing fullerenes packed into carbon nanotubes are called, are to form the basis of the computational units of a truly scalable quantum computer. The computations carried out in them with the nitrogen electron spin are to be read out optically via near-surface NV centers (in diamond plates) above which they are positioned.

The primary goal of the present work was to optically detect the coupling of near-surface single NV centers to the optically undetectable spins of radical molecules on the diamond surface by means of ODMR coupling experiments, and thereby to take decisive steps toward the realization of a quantum register.

An ODMR setup still in its development stage was rebuilt, and its existing functionality was verified on commercial NV-center-rich nanodiamonds. In the next step, the efficiency and mode of measurement were adapted to the detection and manipulation of near-surface (< 7 nm deep) implanted single NV centers in diamond plates. A very large part of the work, which can only partially be described here, consisted of adapting the existing control software to the demands of practical measurement. Subsequently, the correct function of all implemented pulse sequences and other software improvements was verified by measurements on near-surface implanted single NV centers. The setup was also extended with the components necessary for measuring double resonance, such as a controllable electromagnet and an RF signal source. Taking into account the thermal stability of N@C60, an optical cryostat was also planned, built, integrated into the setup and characterized for future experiments.

The spin-spin coupling experiments were carried out with the oxygen-stable galvinoxyl radical as a model system for coupling. The RF spectrum of the coupled radical spin was observed via its coupling to an NV center, and a Rabi nutation of the coupled spin could also be recorded.

Further aspects of the peapod measurement and surface implantation were also examined. It was investigated whether NV detection is disturbed by the SWCNTs, peapods or fullerenes. It turned out that the components of the planned quantum computer, apart from the C60 clusters, are not detectable in an ODMR measurement configuration and will not disturb the NV measurement. It was also considered which types of commercial diamond plates are suitable for surface implantation; a density of implanted NV centers suitable for the coupling measurements was estimated, and an implantation at the estimated density was examined.
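
The Rabi nutation recorded from the coupled spin follows the standard two-level result, in which the spin-flip probability oscillates with the driving pulse length and appears as a periodic dip in the optically read-out fluorescence. The sketch below models that idealized signal; the Rabi frequency and ODMR contrast values are illustrative assumptions, not values from the thesis.

```python
import math

def odmr_rabi_signal(t_us, rabi_mhz, contrast=0.3, baseline=1.0):
    """Idealized ODMR fluorescence during a Rabi nutation: the
    spin-flip probability P = sin^2(pi * f_R * t) lowers the
    normalized fluorescence by `contrast` at full inversion
    (damping and noise are neglected).
    """
    p_flip = math.sin(math.pi * rabi_mhz * t_us) ** 2
    return baseline - contrast * p_flip

# Illustrative trace: a 5 MHz Rabi frequency sampled over one full
# nutation period (0.2 microseconds).
for i in range(11):
    t = i * 0.02  # pulse length in microseconds
    print(f"{t:.2f} us  {odmr_rabi_signal(t, rabi_mhz=5.0):.3f}")
```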

Relevance:

60.00%

Publisher:

Abstract:

There is a puzzling, little-remarked contradiction in scholarly views of the European Commission. On the one hand, the Commission is seen as the maestro of European integration, gently but persistently guiding both governments and firms toward Brussels. On the other hand, the Commission is portrayed as a headless bunch of bickering fiefdoms who can hardly be bothered by anything but their own internecine turf wars. The reason these very different views of the same institution have so seldom come into conflict is quite apparent: EU studies has a set of relatively autonomous and poorly integrated subfields that work at different levels of analysis. Those scholars holding the "heroic" view of the Commission are generally focused on the contest between national and supranational levels that characterized the 1992 program and subsequent major steps toward European integration. By contrast, those scholars with the "bureaucratic politics" view are generally authors of case studies or legislative histories of individual EU directives or decisions. However, the fact that these two images of the Commission are often two ships passing in the night hardly implies that there is no dispute. Clearly both views cannot be right; but then, how can we explain the significant support each enjoys from the empirical record? The Commission, perhaps the single most important supranational body in the world, certainly deserves better than the schizophrenic interpretation the EU studies community has given it. In this paper, I aim to make a contribution toward the unraveling of this paradox. In brief, the argument I make is as follows: the European Commission can be effective in pursuit of its broad integration goals in spite of, and even because of, its internal divisions. The folk wisdom that too many chefs spoil the broth may often be true, but it need not always be so. The paper is organized as follows. I begin with an elaboration of the theoretical position briefly outlined above. I then turn to a case study from the major Commission efforts to restructure the computer industry in the context of its 1992 program. The computer sector does not merely provide interesting, random illustrations of the hypothesis I have advanced. Rather, as Wayne Sandholtz and John Zysman have stressed, the Commission's efforts on informatics formed one of the most crucial parts of the entire 1992 program, and so the Commission's success in "Europeanizing" these issues had significant ripple effects across the entire European political economy. I conclude with some thoughts on the following question: now that the Commission has succeeded in bringing the world to its doorstep, does its bureaucratic division still serve a useful purpose?