58 results for valuation of new technology-based start ups
Abstract:
The introduction of micro-electronic based technology to the workplace has had a far-reaching and widespread effect on the numbers and content of jobs. The importance of the implications of new technology was recognised by the trade unions, leading to a plethora of advice and literature in the late 1970s and early 1980s, notably the TUC 'Technology and Employment' report. However, studies into the union response have consistently found an overall lack of union influence over the introduction of technology. Whilst the advent of new technology has coincided with an industrial relations climate of unprecedented post-war hostility to union activity, unions also face structural weaknesses in coming to terms with the process of technological change; in particular, a lack of suitable technological expertise has been identified. Addressing this perceived weakness of the union response, this thesis is the outcome of a collaborative project between a national union and an academic institution. The thesis is based on detailed case studies of technology bargaining in the Civil Service and the response of the Civil and Public Services Association (CPSA), the union that represents lower-grade white-collar civil servants. It is demonstrated that supplying expertise to union negotiators is insufficient on its own to extend union influence, and that for unions to come to terms with technology and influence its development effectively requires a re-assessment across all spheres of union activity. It is suggested that this has repercussions not only for the internal organisation and quality of union policy formation, and for the extent, form and nature of collective bargaining with employer representatives, but also for relationships with consumer and interest groups outside the traditional collective bargaining forum.
Three policy options are developed in the thesis, with the 'adversarial' and 'co-operative' options representing the more traditional reactive and passive forms of involvement. These are contrasted with an 'independent participative' form of involvement: a 'pro-active' policy option that utilised the expertise of the author in the CPSA's response to technological change.
Abstract:
Objectives: The creation of more high-growth firms continues to be a key component of enterprise policy throughout the countries of the OECD. In the UK, the developing enterprise policy framework highlights the importance of supporting businesses with growth potential. The difficulty, of course, lies in the ability of those delivering business support policies to accurately identify those businesses, especially at start-up, which will benefit from interventions and experience enhanced growth performance. This paper has a core objective of presenting new data on the number of high-growth firms in the UK and providing an assessment of their economic significance.
Prior Work: What is lacking at the core of this policy focus is any comprehensive statistical analysis of the scale and nature of high-growth firms in cohorts of new and established businesses. The evidence base in response to the question "Why do high-growth firms matter?" is surprisingly weak. Important work in this area has been initiated by Bartelsman et al. (2003), Hoffman and Jünge (2006) and Henrekson and Johansson (2009), but to date work in the UK has been limited (BERR, 2008b).
Approach: This paper uses a specially created longitudinal firm-level database, based on the Inter-Departmental Business Register (IDBR) held by the Office for National Statistics (ONS) and covering all private sector businesses in the UK for the period 1997-2008, to investigate the share of high-growth firms (including a sub-set of start-ups more commonly referred to as gazelles) in successive cohorts of start-ups. We apply OECD definitions of high-growth firms and gazelles to this database and are able to quantify for the first time their number (disaggregated by sector, region and size) and importance (employment and sales).
Results: We report that there were approximately 11,500 high-growth firms in the UK in both 2005 and 2008. The share of high-growth start-ups in the UK in 2005 (6.3%) was, contrary to the widely held perception in policy circles, higher than in the United States (5.2%). Of particular interest in the analysis are the growth trajectories (patterns of growth) of these firms, as well as the extent to which they are restricted to technology-based or knowledge-based sectors.
Implications and Value: Using hitherto unused population data for the first time, we answer a fundamental research and policy question on the number and scale of high-growth firms in the UK. We draw the conclusion that this 'rare' event does not readily lend itself to policy intervention, on the grounds that the significant effort needed to identify such businesses ex ante would appear unjustified even if it were possible.
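The OECD definition the paper applies (average annualised employment growth above 20% per annum over a three-year period, for firms with at least 10 employees at the start of the period) can be sketched in code. The firms and figures below are illustrative inventions, not drawn from the IDBR data:

```python
# Hedged sketch: classifying high-growth firms in a cohort under the
# OECD definition. All firm records below are illustrative.

def is_high_growth(employment, threshold=0.20, min_size=10):
    """OECD definition: average annualised employment growth above 20%
    over a three-year period, with at least 10 employees at the start."""
    start, end = employment[0], employment[-1]
    years = len(employment) - 1
    if start < min_size or years < 3:
        return False
    annualised = (end / start) ** (1 / years) - 1
    return annualised > threshold

# Illustrative cohort: firm id -> employment in consecutive years
cohort = {
    "firm_a": [12, 18, 26, 38],   # ~47% p.a. -> high growth
    "firm_b": [50, 52, 55, 58],   # ~5% p.a.  -> not high growth
    "firm_c": [4, 8, 16, 32],     # fast, but below the 10-employee floor
}

high_growth = [f for f, emp in cohort.items() if is_high_growth(emp)]
print(high_growth)  # ['firm_a']
```

Gazelles are then the subset of high-growth firms up to five years old, so the same predicate plus a firm-age filter would identify them in a birth cohort.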
Abstract:
The creation of new ventures is a process characterized by the need to decide and take action in the face of uncertainty, and this is particularly so in the case of technology-based ventures. Effectuation theory (Sarasvathy, 2001) has advanced two possible approaches for making decisions while facing uncertainty in the entrepreneurial process. Causation logic is based on prediction and aims at lowering uncertainty, whereas effectuation logic is based on non-predictive action and aims at working with uncertainty. This study aims to generate more fine-grained insight into the dynamics of effectuation and causation over time. We address the following questions: (1) What patterns can be found in the effectual and causal behaviour of technology-based new ventures over time? (2) How may patterns in the dynamics of effectuation and causation be explained?
Abstract:
ProxiMAX randomisation achieves saturation mutagenesis of contiguous codons without degeneracy or bias. Offering an alternative to trinucleotide phosphoramidite chemistry, it uses nothing more sophisticated than unmodified oligonucleotides and standard molecular biology reagents and, as such, requires no specialised chemistry, reagents or equipment. When particular residues are known to affect protein activity/specificity, their combinatorial replacement with all 20 amino acids, or a subset thereof, can provide a rapid route to generating proteins with desirable characteristics. Conventionally, saturation mutagenesis replaced key codons with degenerate ones. Although simple to perform, that procedure resulted in unnecessarily large libraries, termination codons and inherently uneven amino acid representation. ProxiMAX randomisation is an enzyme-based technique that can encode unbiased representation of all or selected amino acids, or else can provide required codons in pre-defined ratios. Each saturated position can be defined independently of the others. ProxiMAX randomisation is achieved via saturation cycling: an iterative process comprising blunt-end ligation, amplification and digestion with a Type IIS restriction enzyme. We demonstrate both unbiased saturation of a short 6-mer peptide and saturation of a hypervariable region of an scFv antibody fragment, where 11 contiguous codons are saturated with selected codons in pre-defined ratios. As such, ProxiMAX randomisation is particularly relevant to antibody engineering. The development of ProxiMAX randomisation from concept to reality is described.
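The bias problem that ProxiMAX avoids can be illustrated with a short simulation: a conventional degenerate NNK codon set covers all 20 amino acids in 32 codons, but over-represents some residues and retains a stop codon, whereas choosing exactly one codon per amino acid (the MAX/ProxiMAX principle) does neither. The code below is an illustrative sketch, not part of the published protocol:

```python
from collections import Counter
from itertools import product

# Standard genetic code, built from the classic TCAG-ordered amino acid string
bases = "TCAG"
aas = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
codon_table = dict(zip(("".join(c) for c in product(bases, repeat=3)), aas))

# Degenerate NNK codon set: N = A/C/G/T, K = G/T  (32 codons in total)
nnk = ["".join(c) + k for c in product("ACGT", repeat=2) for k in "GT"]
counts = Counter(codon_table[c] for c in nnk)
print(counts["L"], counts["M"], counts["*"])  # 3 1 1 -> Leu 3x, plus a TAG stop

# Non-degenerate selection (MAX/ProxiMAX style): exactly one codon per amino acid
one_per_aa = {}
for codon, aa in codon_table.items():
    if aa != "*":
        one_per_aa.setdefault(aa, codon)
print(len(one_per_aa))  # 20 codons, no stops, even representation
```

With one codon per amino acid, each residue appears with equal (or deliberately chosen) frequency, which is exactly what the abstract means by encoding representation "in pre-defined ratios".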
Abstract:
To meet the changing needs of customers and to survive in an increasingly globalised and competitive environment, it is necessary for companies to equip themselves with intelligent tools, thereby enabling managerial levels to make better tactical decisions. However, the implementation of an intelligent system is always a challenge in Small- and Medium-sized Enterprises (SMEs). Therefore, a new and simple approach with 'process rethinking' ability is proposed to generate ongoing process improvements over time. In this paper, a roadmap for the development of an agent-based information system is described. A case example is also provided to show how the system can assist non-specialists, for example managers and engineers, to make the right decisions for continual process improvement. Copyright © 2006 Inderscience Enterprises Ltd.
Abstract:
The primary objective of this research was to understand what kinds of knowledge and skills people use in 'extracting' relevant information from text, and to assess the extent to which expert systems techniques could be applied to automate the process of abstracting. The approach adopted in this thesis is based on research in cognitive science, information science, psycholinguistics and text linguistics. The study addressed the significance of domain knowledge and heuristic rules by developing an information extraction system, called INFORMEX. This system, which was implemented partly in SPITBOL and partly in PROLOG, used a set of heuristic rules to analyse five scientific papers of expository type, to interpret their content in relation to the key abstract elements, and to extract a set of sentences recognised as relevant for abstracting purposes. The analysis of these extracts revealed that an adequate abstract could be generated. Furthermore, INFORMEX showed that a rule-based system was a suitable computational model for representing experts' knowledge and strategies. This computational technique provided the basis for a new approach to the modelling of cognition, showing how experts tackle the task of abstracting by integrating formal knowledge with experiential learning. This thesis demonstrated that empirical and theoretical knowledge can be effectively combined in expert systems technology to provide a valuable starting approach to automatic abstracting.
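A cue-phrase heuristic of the kind used in rule-based sentence extraction can be sketched as follows; the patterns and weights are illustrative inventions, not INFORMEX's actual rule set (which was implemented in SPITBOL and PROLOG):

```python
import re

# Hedged sketch of heuristic sentence extraction: score each sentence by
# cue phrases, then keep the top-scoring sentences in document order.
# These rules and weights are illustrative only.
CUE_WEIGHTS = {
    r"\bthe (aim|purpose|objective) of this (paper|study)\b": 3,
    r"\bwe (show|demonstrate|conclude|propose)\b": 2,
    r"\bin conclusion\b|\bthe results (show|indicate)\b": 3,
    r"\bfor example\b|\bfor instance\b": -2,   # detail, not abstract-level
}

def extract(text, top_n=2):
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    scored = []
    for i, s in enumerate(sentences):
        score = sum(w for pat, w in CUE_WEIGHTS.items()
                    if re.search(pat, s, re.IGNORECASE))
        scored.append((score, i, s))
    best = sorted(scored, key=lambda t: (-t[0], t[1]))[:top_n]
    return [s for _, _, s in sorted(best, key=lambda t: t[1])]

sample = ("The aim of this paper is to examine X. Several methods exist. "
          "For example, method A is common. We show that X improves Y.")
print(extract(sample))
```

Real systems of this kind combine such surface cues with domain knowledge and discourse structure, which is precisely the gap the thesis investigated.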
Abstract:
The existing literature has given little consideration to the social values of information technology in general, or of wireless technology in particular. The purpose of this paper is thus to shed new light on this issue. Based on an interpretive case study, we examine two healthcare organisations and discover that social values are often manifested beyond, as well as within, organisations. A matrix of social values in relation to technology changes and their interactions with various stakeholders is further discussed. The matrix helps in understanding how various social values emerge from and revolve around organisations' strategic management of information technology. The implications of the findings about social values are discussed and future research directions are suggested.
Abstract:
The authors propose a new approach to discourse analysis based on metadata from the social networking behaviour of learners who are immersed in a socially constructivist e-learning environment. It is shown that traditional data modelling techniques can be combined with social network analysis, an approach that promises to yield new insights into the largely uncharted domain of network-based discourse analysis. The chapter is intended as a non-technical introduction and is illustrated with real examples, visual representations and empirical findings. Within the setting of a constructivist statistics course, the chapter provides an illustration of what network-based discourse analysis is about (mainly from a methodological point of view), how it is implemented in practice, and why it is relevant for researchers and educators.
Abstract:
Back in 2003, we published ‘MAX’ randomisation, a process of non-degenerate saturation mutagenesis using exactly 20 codons (one for each amino acid) or else any required subset of those 20 codons. ‘MAX’ randomisation saturates codons located in isolated positions within a protein, as might be required in enzyme engineering, or else on one face of an alpha-helix, as in zinc finger engineering. Since that time, we have been asked for an equivalent process that can saturate multiple, contiguous codons in a non-degenerate manner. We have now developed ‘ProxiMAX’ randomisation, which does just that: generating DNA cassettes for saturation mutagenesis without degeneracy or bias. Offering an alternative to trinucleotide phosphoramidite chemistry, ProxiMAX randomisation uses nothing more sophisticated than unmodified oligonucleotides and standard molecular biology reagents. Thus it requires no specialised chemistry, reagents or equipment, and simply relies on a process of saturation cycling comprising ligation, amplification and digestion for each cycle. The process can encode either unbiased representation of selected amino acids or else encode them in pre-defined ratios. Each saturated position can be defined independently of the others. We demonstrate accurate saturation of up to 11 contiguous codons. As such, ProxiMAX randomisation is particularly relevant to antibody engineering.
Abstract:
This review offers new perspectives on the subject and highlights an area in need of further research. It includes an analysis of current scientific literature mainly covering the last decade and examines the trends in the development of electronic, acoustic and optical-fiber humidity sensors over this period. The major findings indicate that a new generation of sensor technology based on optical fibers is emerging. The current trends suggest that electronic humidity sensors could soon be replaced by sensors that are based on photonic structures. Recent scientific advances are expected to allow dedicated systems to avoid the relatively high price of interrogation modules that is currently a major disadvantage of fiber-based sensors.
Abstract:
Using data on all high- and medium-tech start-ups in the UK in 2000, this paper assesses the effect of a firm's decision to patent on its subsequent growth between 2001 and 2005. We propose a new approach to addressing well-known issues challenging identification of any patent effect: firm heterogeneity, simultaneity between firm performance and patenting, and sample selection. Our findings suggest that patentees have higher asset growth than non-patentees, of between 8% and 27% per annum. © 2011 Elsevier B.V. All rights reserved.
Abstract:
Measuring variations in efficiency and its extension, eco-efficiency, during a restructuring period in different industries has always been a point of interest for regulators and policy makers. This paper assesses the impact of the restructuring of procurement in the Iranian power industry on the performance of power plants. We introduce a new slacks-based model for Malmquist-Luenberger (ML) index measurement and apply it to the power plants to calculate efficiency, eco-efficiency and technological change over the 8-year period (2003-2010) of restructuring in the power industry. The results reveal that although the restructuring had different effects on individual power plants, the overall growth in the eco-efficiency of the sector was mainly due to advances in pure technology. We also assess the correlation between the efficiency and eco-efficiency of the power plants, which indicates a close relationship between these two measures, thus lending support to the incorporation of environmental factors in efficiency analysis. © 2014 Elsevier Ltd.
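The full slacks-based Malmquist-Luenberger computation requires solving DEA linear programs for each plant and period; the sketch below shows only the standard Malmquist index arithmetic given precomputed distance-function values (the numbers are illustrative), decomposed into the efficiency-change and technical-change components the paper reports:

```python
from math import sqrt, isclose

def malmquist(d_t_t, d_t_t1, d_t1_t, d_t1_t1):
    """Malmquist productivity index from four distance-function values.
    d_a_b = distance of the period-b observation against the period-a frontier.
    Returns (index, efficiency change, technical change); index > 1 means
    productivity growth."""
    mi = sqrt((d_t_t1 / d_t_t) * (d_t1_t1 / d_t1_t))
    efficiency_change = d_t1_t1 / d_t_t
    technical_change = sqrt((d_t_t1 / d_t1_t1) * (d_t_t / d_t1_t))
    return mi, efficiency_change, technical_change

# Illustrative values for one power plant across two periods
mi, ec, tc = malmquist(0.80, 0.95, 0.70, 0.85)
assert isclose(mi, ec * tc)  # the decomposition is exact: MI = EC x TC
print(round(mi, 3), round(ec, 4), round(tc, 4))
```

The ML variant replaces these distance functions with directional distance functions that credit reductions in undesirable outputs (e.g. emissions), which is how eco-efficiency enters the measurement; the decomposition into catching-up (EC) and frontier-shift (TC) carries over unchanged.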