192 results for Target Selection
in Cambridge University Engineering Department Publications Database
Abstract:
Matching a new technology to an appropriate market is a major challenge for new technology-based firms (NTBFs). Such firms are often advised to target niche markets where the firms and their technologies can establish themselves relatively free of incumbent competition. However, technologies are diverse in nature and do not benefit from identical strategies. In contrast to many Information and Communication Technology (ICT) innovations, which build on an established knowledge base for fairly specific applications, technologies based on emerging science are often generic and so have a number of markets and applications open to them, each carrying considerable technological and market uncertainty. Each of these potential markets is part of a complex and evolving ecosystem from which the venture may have to access significant complementary assets in order to create and sustain commercial value. Based on dataset and case study research on UK advanced materials university spin-outs (USOs), we find that, contrary to conventional wisdom, the more commercially successful ventures targeted mainstream markets by working closely with large, established competitors during early development. While niche markets promise protection from incumbent firms, science-based innovations, such as new materials, often require the presence, and participation, of established companies in order to create value. © 2012 IEEE.
Abstract:
This paper is part of a larger PhD research project examining the apparent conflict in UK planning between energy efficiency and conservation in the retrofit of the thermal envelope of the existing building stock. A review of the literature shows that the UK will not meet its 2050 emission reduction target without substantial improvement to the energy performance of the thermal envelope of the existing building stock, and that, significantly, 40% of the existing stock has heritage status and may be exempted from Building Regulations. A review of UK policy and legislation shows that there are clear national priorities towards reducing emissions and addressing climate change, yet also shows a movement towards local decision making and control. This paper compares the current status of thirteen London Boroughs with respect to their position on thermal envelope retrofit for heritage and traditionally constructed buildings. Data collection is through ongoing surveys and interviews that compare statistical data, planning policies, sustainability and environmental priorities, and Officer decision-making. This paper finds that there is a lack of consistency in the application of planning policy across Boroughs and suggests that this is a barrier to the uptake of energy-efficient retrofit. Various recommendations are made at both national and local level which could help UK planning and planning officers deliver more energy-efficient heritage retrofits.
Abstract:
A significant cost in obtaining acoustic training data is the generation of accurate transcriptions. For some sources closed-caption data is available, which allows the use of lightly supervised training techniques. However, for other sources and languages closed captions are not available, and in these cases unsupervised training techniques must be used. This paper examines the use of unsupervised techniques for discriminative training. In unsupervised training, automatic transcriptions from a recognition system are used for training. As these transcriptions may be errorful, data selection may be useful. Two forms of selection are described: one to remove non-target-language shows, the other to remove segments with low confidence. Experiments were carried out on a Mandarin transcription task. Two types of test data were considered, Broadcast News (BN) and Broadcast Conversations (BC). Results show that the gains from unsupervised discriminative training are highly dependent on the accuracy of the automatic transcriptions. © 2007 IEEE.
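To make the two selection steps in the abstract above concrete, here is a minimal Python sketch, not the authors' implementation: it assumes each automatically transcribed segment carries a show-level language label and a per-segment recogniser confidence, with field names and the 0.7 threshold chosen purely for illustration.

from dataclasses import dataclass

@dataclass
class Segment:
    show_id: str
    language: str      # hypothesised language of the parent show (assumed field)
    confidence: float  # recogniser confidence for the segment, in [0, 1]
    transcript: str

def select_training_data(segments, target_language="mandarin", min_confidence=0.7):
    """Keep only target-language shows, then drop low-confidence segments."""
    kept = []
    for seg in segments:
        if seg.language != target_language:
            continue                      # remove non-target-language shows
        if seg.confidence < min_confidence:
            continue                      # remove low-confidence segments
        kept.append(seg)
    return kept

# Example: only the first segment survives both filters.
data = [
    Segment("show_1", "mandarin", 0.92, "..."),
    Segment("show_1", "mandarin", 0.41, "..."),
    Segment("show_2", "english", 0.88, "..."),
]
print(len(select_training_data(data)))   # -> 1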
Abstract:
Variable selection for regression is a classical statistical problem, motivated by concerns that too large a number of covariates may bring about overfitting and unnecessarily high measurement costs. Novel difficulties arise in streaming contexts, where the correlation structure of the process may be drifting, in which case it must be constantly tracked so that selections may be revised accordingly. A particularly interesting phenomenon is that non-selected covariates become missing variables, inducing bias on subsequent decisions. This raises an intricate exploration-exploitation tradeoff, whose dependence on the covariance tracking algorithm and the choice of variable selection scheme is too complex to be dealt with analytically. We hence capitalise on the strength of simulations to explore this problem, taking the opportunity to tackle the difficult task of simulating dynamic correlation structures. © 2008 IEEE.
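One simple way to simulate the drifting correlation structures discussed in the abstract above is to let a latent factor matrix follow a slow random walk, so the implied covariance stays positive definite while its correlation pattern evolves. The sketch below illustrates only that simulation idea; the parameterisation (Sigma_t = A_t A_t' + eps*I) and all settings are assumptions for illustration, not the authors' method.

import numpy as np

def simulate_drifting_correlations(T=500, p=5, drift=0.02, eps=0.1, seed=0):
    """Generate T observations of p covariates with a slowly drifting correlation structure."""
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((p, p))
    data, correlations = [], []
    for _ in range(T):
        A = A + drift * rng.standard_normal((p, p))   # slow random walk on the latent factor
        cov = A @ A.T + eps * np.eye(p)               # always positive definite
        d = np.sqrt(np.diag(cov))
        correlations.append(cov / np.outer(d, d))     # implied correlation matrix at time t
        data.append(rng.multivariate_normal(np.zeros(p), cov))
    return np.array(data), correlations

X, corrs = simulate_drifting_correlations()
print(X.shape, corrs[0].shape)   # (500, 5) (5, 5)

Streaming variable selection and covariance tracking schemes can then be run against such a simulated stream to study how non-selected (hence unobserved) covariates bias later decisions.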
Abstract:
Sensor networks can be naturally represented as graphical models, where the edge set encodes the presence of sparsity in the correlation structure between sensors. Such graphical representations can be valuable for information mining purposes as well as for optimizing bandwidth and battery usage with minimal loss of estimation accuracy. We use a computationally efficient technique for estimating sparse graphical models which fits a sparse linear regression locally at each node of the graph via the Lasso estimator. Using a recently suggested online, temporally adaptive implementation of the Lasso, we propose an algorithm for streaming graphical model selection over sensor networks. With battery consumption minimization applications in mind, we use this algorithm as the basis of an adaptive querying scheme. We discuss implementation issues in the context of environmental monitoring using sensor networks, where the objective is short-term forecasting of local wind direction. The algorithm is tested against real UK weather data and conclusions are drawn about certain tradeoffs inherent in decentralized sensor network data analysis. © 2010 The Author. Published by Oxford University Press on behalf of The British Computer Society. All rights reserved.
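The node-wise Lasso idea in the abstract above can be sketched in a batch form: regress each sensor on all the others and read the graph off the non-zero coefficients. The paper uses an online, temporally adaptive Lasso; the batch version below (using scikit-learn's Lasso, with an arbitrary regularisation level and an OR rule for symmetrising) is only an illustration of the basic neighbourhood-selection step.

import numpy as np
from sklearn.linear_model import Lasso

def neighbourhood_selection(X, alpha=0.1):
    """X: (n_samples, n_sensors). Returns a symmetric boolean adjacency matrix."""
    n, p = X.shape
    adj = np.zeros((p, p), dtype=bool)
    for j in range(p):
        others = np.delete(np.arange(p), j)
        coef = Lasso(alpha=alpha).fit(X[:, others], X[:, j]).coef_
        adj[j, others] = np.abs(coef) > 1e-8     # edge wherever the local fit keeps a coefficient
    return adj | adj.T                           # OR rule combining the two directed estimates

# Toy example with 4 strongly correlated "sensors".
rng = np.random.default_rng(1)
base = rng.standard_normal((200, 1))
X = np.hstack([base + 0.1 * rng.standard_normal((200, 1)) for _ in range(4)])
print(neighbourhood_selection(X, alpha=0.05))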
Abstract:
We present a stochastic simulation technique for subset selection in time series models, based on the use of indicator variables with the Gibbs sampler within a hierarchical Bayesian framework. As an example, the method is applied to the selection of subset linear AR models, in which only significant lags are included. Joint sampling of the indicators and parameters is found to speed convergence. We discuss the possibility of model mixing where the model is not well determined by the data, and the extension of the approach to include non-linear model terms.
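The following is a minimal sketch of the indicator-variable idea in the abstract above for subset AR selection, not the authors' code. It assumes a known noise variance sigma2, independent N(0, tau2) priors on the lag coefficients, and Bernoulli(pi) priors on the indicators; for each lag the indicator is updated with its coefficient integrated out and the coefficient is then redrawn, which reflects the joint sampling of indicators and parameters mentioned in the abstract.

import numpy as np

def subset_ar_gibbs(y, max_lag=5, n_iter=2000, sigma2=1.0, tau2=1.0, pi=0.5, seed=0):
    """Return posterior inclusion probabilities for each AR lag (burn-in discarded)."""
    rng = np.random.default_rng(seed)
    # Lagged design matrix: row t holds (y_{t-1}, ..., y_{t-max_lag}).
    Y = y[max_lag:]
    X = np.column_stack([y[max_lag - k: len(y) - k] for k in range(1, max_lag + 1)])
    gamma = np.ones(max_lag, dtype=int)
    a = np.zeros(max_lag)
    inclusion_counts = np.zeros(max_lag)

    for it in range(n_iter):
        for k in range(max_lag):
            # Residual with lag k removed from the current fit.
            r = Y - (X @ (gamma * a) - gamma[k] * a[k] * X[:, k])
            xk = X[:, k]
            s = xk @ xk
            # Log likelihoods (common constants dropped) with a_k integrated out under N(0, tau2).
            log_l0 = -0.5 * (r @ r) / sigma2
            quad = (r @ r) / sigma2 - tau2 * (xk @ r) ** 2 / (sigma2 * (sigma2 + tau2 * s))
            log_l1 = -0.5 * (np.log1p(tau2 * s / sigma2) + quad)
            # Bernoulli update for the indicator, computed stably on the log scale.
            diff = np.clip(np.log(1 - pi) + log_l0 - (np.log(pi) + log_l1), -700.0, 700.0)
            gamma[k] = int(rng.random() < 1.0 / (1.0 + np.exp(diff)))
            if gamma[k]:
                # Conditional Gaussian draw for the included coefficient.
                v = 1.0 / (s / sigma2 + 1.0 / tau2)
                m = v * (xk @ r) / sigma2
                a[k] = m + np.sqrt(v) * rng.standard_normal()
            else:
                a[k] = 0.0
        if it >= n_iter // 2:                    # discard the first half as burn-in
            inclusion_counts += gamma
    return inclusion_counts / (n_iter - n_iter // 2)

# Toy data: an AR process in which only lags 1 and 3 are active.
rng = np.random.default_rng(42)
y = np.zeros(600)
for t in range(3, 600):
    y[t] = 0.5 * y[t - 1] + 0.3 * y[t - 3] + rng.standard_normal()
print(subset_ar_gibbs(y))   # posterior inclusion probabilities per lag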