Abstract:
There is clear and increasing interest in short-time annealing far below one second, i.e. below spike annealing, the lower limit of Rapid Thermal Processing (RTP). This was driven by the need to suppress the so-called Transient Enhanced Diffusion in advanced boron-implanted shallow pn-junctions in silicon technology. Meanwhile, interest in flash lamp annealing (FLA) in the millisecond range has spread into other fields related to silicon technology and beyond. This paper reports on recent experiments regarding shallow junction engineering in germanium, annealing of ITO layers on glass and plastic foil to form a conductive layer, as well as investigations carried out in recent years in the field of wide band gap semiconductor materials (SiC, ZnO). A common thread emerging from this work concerns the modeling of wafer stress during millisecond thermal processing with flash lamps. Finally, recent achievements in the field of silicon-based light emission based on Metal-Oxide-Semiconductor Light Emitting Devices are reported. © 2007 IEEE.
Abstract:
Purpose: This paper seeks to measure quantitatively the degree of alignment among a set of performance measures between two organizations. Design/methodology/approach: The paper extends Venkatraman's test of coalignment to assess the alignment of a set of performance measures governing a contractual inter-organizational relationship. The authors applied the test and present coefficients of misalignment across three sets of measures: those used by a service provider involved in the research, those used by customers contracting the services, and those documented in the 11 contracts studied. Findings: Results confirmed a high degree of alignment between target and actual operational performance in the contracts. The alignment between customers' financial objectives and the contracts' operational metrics was low. Calculations show poor alignment between the provider's objectives and the contribution received from the contracts. Research limitations/implications: Limitations of the conclusions include the small sample of contracts used in the calculations. Further research should include not only actual contracts, but also failed ones. Practical implications: Misaligned goals, reflected in misaligned performance measures, may lead to tensions in inter-firm relationships. If these tensions are not addressed properly, the relationship could become unstable or be terminated prematurely. This method of measuring alignment could provide early detection of potential dangers in inter-firm relationships. Originality/value: The paper extends Venkatraman's test of coalignment to the inter-organizational setting; management researchers and business professionals may use this methodology when exploring degrees of alignment of performance measures in intra-functional and inter-firm relationships. © Emerald Group Publishing Limited.
Abstract:
Interest is growing in the application of lean techniques to new product introduction (NPI). Although a relatively emergent topic compared with the application of 'lean' within the factory, since 2000 there has been an exponential rise in the literature on this subject. However, much of this work focuses on describing and extolling the virtues of the 'Toyota approach' to design. Therefore, by way of a stock-take for the UK, the present authors' research set out to understand how well lean product design practices have been adopted by leading manufacturers. This was achieved by carrying out in-depth case studies with three carefully selected manufacturers of complex engineered products. This paper describes these studies, the detailed results and subsequent findings, and concludes that both the awareness and the adoption of these practices are generally embryonic and far removed from the theory advocated in the literature. © IMechE 2007.
Abstract:
This session described the FET Flagship Pilot on graphene and related two-dimensional materials. The flagship targets a revolution in information and communication technology, with impacts reaching into other areas of society. The session featured four talks covering the scientific and technological potential and open research challenges within the scope of the proposed flagship, an industrial view of the possibilities and challenges posed by graphene and related materials, and a presentation on the implementation and structure of the flagship pilot. © Selection and peer-review under responsibility of FET11 conference organizers and published by Elsevier B.V.
Abstract:
Matching a new technology to an appropriate market is a major challenge for new technology-based firms (NTBFs). Such firms are often advised to target niche markets where the firms and their technologies can establish themselves relatively free of incumbent competition. However, technologies are diverse in nature and do not all benefit from identical strategies. In contrast to many Information and Communication Technology (ICT) innovations, which build on an established knowledge base for fairly specific applications, technologies based on emerging science are often generic and so have a number of markets and applications open to them, each carrying considerable technological and market uncertainty. Each of these potential markets is part of a complex and evolving ecosystem from which the venture may have to access significant complementary assets in order to create and sustain commercial value. Based on dataset and case-study research on UK advanced-materials university spin-outs (USOs), we find that, contrary to conventional wisdom, the more commercially successful ventures targeted mainstream markets by working closely with large, established competitors during early development. While niche markets promise protection from incumbent firms, science-based innovations, such as new materials, often require the presence, and participation, of established companies in order to create value. © 2012 IEEE.
Abstract:
Developing a theoretical description of turbulent plumes, the likes of which may be seen rising above industrial chimneys, is a daunting thought. Plumes are ubiquitous on a wide range of scales in both the natural and the man-made environments. Examples that immediately come to mind are the vapour plumes above industrial smoke stacks or the ash plumes forming particle-laden clouds above an erupting volcano. However, plumes also occur where they are less visually apparent, such as the rising stream of warm air above a domestic radiator, of oil from a subsea blowout or, at a larger scale, of air above the so-called urban heat island. In many instances, not only is the plume itself of interest but also its influence on the environment as a whole through the process of entrainment. Zeldovich (1937, The asymptotic laws of freely-ascending convective flows. Zh. Eksp. Teor. Fiz., 7, 1463-1465 (in Russian)), Batchelor (1954, Heat convection and buoyancy effects in fluids. Q. J. R. Meteor. Soc., 80, 339-358) and Morton et al. (1956, Turbulent gravitational convection from maintained and instantaneous sources. Proc. R. Soc. Lond. A, 234, 1-23) laid the foundations for classical plume theory, a theoretical description that is elegant in its simplicity and yet encapsulates the complex turbulent engulfment of ambient fluid into the plume. Testament to the insight and approach developed in these early models of plumes is that the essential theory remains unchanged and is widely applied today. We describe the foundations of plume theory and link the theoretical developments with the measurements made in experiments necessary to close these models, before discussing some recent developments in plume theory, including an approach which generalizes results obtained separately for the Boussinesq and the non-Boussinesq plume cases.
The theory presented, despite its simplicity, has been very successful at describing and explaining the behaviour of plumes across the wide range of scales at which they are observed. We present solutions to the coupled set of ordinary differential equations (the plume conservation equations) that Morton et al. (1956) derived from the Navier-Stokes equations which govern fluid motion. In order to describe and contrast the bulk behaviour of rising plumes from general area sources, we present closed-form solutions to the plume conservation equations obtained by solving for the variation with height of Morton's non-dimensional flux parameter Γ; this single flux parameter gives a unique representation of the behaviour of steady plumes and enables a characterization of the different types of plume. We discuss the advantages of solutions in this form before describing extensions to plume theory and suggesting directions for new research. © 2010 The Author. Published by Oxford University Press on behalf of the Institute of Mathematics and its Applications. All rights reserved.
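As a concrete illustration (not taken from the paper), the plume conservation equations of Morton et al. (1956) can be integrated numerically. The sketch below is a minimal forward-Euler integration assuming a Boussinesq plume with top-hat profiles rising through an unstratified ambient, with an illustrative entrainment coefficient α = 0.1; in the top-hat convention, Γ = 5FQ²/(8αM^{5/2}), which equals 1 for a pure plume, exceeds 1 for lazy releases and is below 1 for forced ones.

```python
import math

ALPHA = 0.1  # entrainment coefficient (illustrative value for top-hat profiles)

def gamma(Q, M, F):
    """Morton's non-dimensional flux parameter in top-hat form:
    Gamma = 1 for a pure plume, > 1 lazy, < 1 forced."""
    return 5.0 * F * Q**2 / (8.0 * ALPHA * M**2.5)

def integrate_plume(Q0, M0, F0, z0=0.01, z1=10.0, n=100000):
    """Forward-Euler integration of the MTT conservation equations
    for a Boussinesq plume in an unstratified ambient:
        dQ/dz = 2*alpha*sqrt(M)   (entrainment of ambient fluid)
        dM/dz = F*Q/M             (buoyancy accelerates the flow)
        dF/dz = 0                 (buoyancy flux conserved)
    where Q, M, F are the volume, specific momentum and buoyancy fluxes."""
    dz = (z1 - z0) / n
    Q, M, F = Q0, M0, F0
    for _ in range(n):
        dQdz = 2.0 * ALPHA * math.sqrt(M)
        dMdz = F * Q / M
        Q, M = Q + dQdz * dz, M + dMdz * dz
    return Q, M, F

# A very lazy release (Gamma >> 1 at the source) relaxes toward
# pure-plume behaviour, Gamma -> 1, as it rises.
Q, M, F = integrate_plume(1e-3, 1e-3, 1.0)
```

The relaxation of Γ toward unity with height is exactly the behaviour that the closed-form solutions in terms of Γ capture analytically.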
Abstract:
Ideally, one would like to perform image search using an intuitive and friendly approach. Many existing image search engines, however, present users with sets of images arranged on screen in some default order, typically by relevance to a query only. While this certainly has its advantages, arguably a more flexible and intuitive way would be to sort images into arbitrary structures such as grids, hierarchies, or spheres, so that images that are visually or semantically alike are placed together. This paper focuses on designing such a navigation system for image browsers. This is a challenging task because an arbitrary layout structure makes it difficult, if not impossible, to compute cross-similarities between images and structure coordinates, the main ingredient of traditional layout approaches. For this reason, we resort to a recently developed machine learning technique: kernelized sorting. It is a general technique for matching pairs of objects from different domains without requiring cross-domain similarity measures, and hence elegantly allows sorting images into arbitrary structures. Moreover, we extend it so that some images can be preselected, for instance to form the top of the hierarchy, allowing the user to subsequently navigate through the search results at the lower levels in an intuitive way. Copyright 2010 ACM.
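To make the matching idea concrete: kernelized sorting seeks a permutation π that maximizes an objective of the form tr(K π L π^T), where K is a kernel over the images and L a kernel over the layout positions, so no cross-domain similarity is ever needed. The toy below (an illustration, not the paper's algorithm, which uses centered kernels and iterative linear-assignment solves rather than brute force) exhaustively searches permutations for a tiny example.

```python
import itertools
import math

def rbf_kernel(xs, gamma=1.0):
    # Gaussian similarity within one domain only.
    return [[math.exp(-gamma * (a - b) ** 2) for b in xs] for a in xs]

def objective(K, L, perm):
    # tr(K P L P^T) = sum_ij K[i][j] * L[perm[i]][perm[j]]
    n = len(K)
    return sum(K[i][j] * L[perm[i]][perm[j]]
               for i in range(n) for j in range(n))

def kernelized_sort_bruteforce(K, L):
    """Find the permutation matching objects (kernel K) to layout
    positions (kernel L); exhaustive, so only feasible for tiny n."""
    n = len(K)
    return max(itertools.permutations(range(n)),
               key=lambda p: objective(K, L, p))

# Toy: five "images" with 1-D features assigned to five grid slots.
K = rbf_kernel([0.0, 1.0, 2.0, 3.0, 4.0])
L = rbf_kernel([0.0, 1.0, 2.0, 3.0, 4.0])
perm = kernelized_sort_bruteforce(K, L)
```

With identical, equally spaced features on both sides the optimal assignment is order-preserving (or order-reversing, which scores identically), i.e. similar images land in nearby slots.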
Abstract:
Many visual datasets are traditionally used to analyze the performance of different learning techniques. The evaluation is usually done within each dataset, so it is questionable whether such results are a reliable indicator of true generalization ability. We propose here an algorithm to exploit existing data resources when learning a new multiclass problem. Our main idea is to identify an image representation that decomposes orthogonally into two subspaces: a part specific to each dataset, and a part generic to, and therefore shared between, all the considered source sets. This allows us to use the generic representation as unbiased reference knowledge for a novel classification task. By casting the method in the multi-view setting, we also make it possible to use different features for different databases. We call the algorithm MUST, Multitask Unaligned Shared knowledge Transfer. Through extensive experiments on five public datasets, we show that MUST consistently improves cross-dataset generalization performance. © 2013 Springer-Verlag.
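The abstract does not specify how MUST learns its subspaces, but the core orthogonal-decomposition idea can be illustrated with a hypothetical example: given an orthonormal basis for the shared (generic) subspace, any feature vector splits exactly into a shared component plus an orthogonal, dataset-specific remainder.

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def project_split(x, shared_basis):
    """Split feature vector x into its component inside the shared
    subspace (spanned by an orthonormal basis) and the orthogonal,
    dataset-specific remainder, so that x = shared + specific and
    dot(shared, specific) == 0."""
    shared = [0.0] * len(x)
    for b in shared_basis:
        c = dot(x, b)  # coordinate of x along basis vector b
        shared = [s + c * bi for s, bi in zip(shared, b)]
    specific = [xi - si for xi, si in zip(x, shared)]
    return shared, specific

# Hypothetical 3-D features: the first two axes stand in for the
# generic (shared) part, the third axis for the dataset-specific part.
shared, specific = project_split([1.0, 2.0, 3.0],
                                 [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
```

Only the shared component would then be reused as the unbiased reference representation for a new classification task.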