59 results for "feature based cost"
Abstract:
Board-level optical links are an attractive alternative to their electrical counterparts as they provide higher bandwidth and lower power consumption at high data rates. However, on-board optical technology has to be cost-effective to be commercially deployed. This study presents a chip-to-chip optical interconnect formed on an optoelectronic printed circuit board that uses a simple optical coupling scheme and cost-effective materials and is compatible with well-established manufacturing processes common to the electronics industry. Details of the link architecture, modelling studies of the link's frequency response, characterisation of optical coupling efficiencies and dynamic performance studies of this proof-of-concept chip-to-chip optical interconnect are reported. The fully assembled link exhibits a -3 dBe bandwidth of 9 GHz and -3 dBo tolerances to transverse component misalignments of ±25 and ±37 μm at the input and output waveguide interfaces, respectively. The link has a total insertion loss of 6 dBo and achieves error-free transmission at a 10 Gb/s data rate with a power margin of 11.6 dBo for a bit-error-rate of 10⁻¹². The proposed architecture demonstrates an integration approach for high-speed board-level chip-to-chip optical links that emphasises component simplicity and manufacturability, both crucial to the migration of such technology into real-world commercial systems. © 2012 The Institution of Engineering and Technology.
Abstract:
The existing machine vision-based 3D reconstruction software programs provide a promising low-cost and, in some cases, automatic solution for infrastructure as-built documentation. However, in several steps of the reconstruction process they rely only on detecting and matching corner-like features in multiple views of a scene. Therefore, in infrastructure scenes that include uniform materials and poorly textured surfaces, these programs fail with high probability due to a lack of feature points. Moreover, except for a few programs that generate dense 3D models through significantly time-consuming algorithms, most of them provide only a sparse reconstruction which does not necessarily include required points such as corners or edges; these points therefore have to be manually matched across different views, which can make the process considerably laborious. To address these limitations, this paper presents a video-based as-built documentation method that automatically builds detailed 3D maps of a scene by aligning edge points between video frames. Compared to corner-like features, edge points are far more plentiful even in untextured scenes and often carry important semantic associations. The method has been tested on poorly textured infrastructure scenes and the results indicate that a combination of edge and corner-like features would allow dealing with a broader range of scenes.
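As a rough illustration of the edge-versus-corner density argument above (not the authors' implementation), the following Python/OpenCV sketch simply counts Canny edge pixels against Shi-Tomasi corner features in one frame of a poorly textured scene; the frame path and detector thresholds are placeholder assumptions.

```python
# Illustrative sketch only: counts edge points vs corner-like features in one frame.
# The frame path and detector parameters are assumptions, not values from the paper.
import cv2
import numpy as np

frame = cv2.imread("frame_0001.png", cv2.IMREAD_GRAYSCALE)  # hypothetical frame

# Corner-like features (Shi-Tomasi); typically sparse on untextured surfaces.
corners = cv2.goodFeaturesToTrack(frame, maxCorners=5000, qualityLevel=0.01, minDistance=5)
n_corners = 0 if corners is None else len(corners)

# Edge points from the Canny detector; usually far denser in the same scene.
edges = cv2.Canny(frame, 50, 150)
edge_points = np.column_stack(np.nonzero(edges))  # (row, col) coordinates of edge pixels

print(f"corner-like features: {n_corners}")
print(f"edge points:          {len(edge_points)}")
```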
Abstract:
We develop a group-theoretical analysis of slow feature analysis for the case where the input data are generated by applying a set of continuous transformations to static templates. As an application of the theory, we analytically derive nonlinear visual receptive fields and show that their optimal stimuli, as well as the orientation and frequency tuning, are in good agreement with previous simulations of complex cells in primary visual cortex (Berkes and Wiskott, 2005). The theory suggests that side and end stopping can be interpreted as a weak breaking of translation invariance. Direction selectivity is also discussed. © 2011 Massachusetts Institute of Technology.
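For reference, the underlying slow feature analysis optimization problem that the group-theoretical treatment builds on can be stated as follows (the standard formulation due to Wiskott and Sejnowski, not a result specific to this paper):

```latex
% Standard slow feature analysis problem (Wiskott & Sejnowski, 2002):
% find functions g_j whose outputs y_j(t) = g_j(x(t)) vary as slowly as possible.
\begin{aligned}
\min_{g_j}\quad & \Delta(y_j) = \big\langle \dot{y}_j^{\,2} \big\rangle_t \\
\text{subject to}\quad & \langle y_j \rangle_t = 0,\qquad
                         \langle y_j^{\,2} \rangle_t = 1,\qquad
                         \langle y_i\, y_j \rangle_t = 0 \;\; (i < j).
\end{aligned}
```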
Abstract:
This paper presents a novel coarse-to-fine global localization approach inspired by object recognition and text retrieval techniques. Harris-Laplace interest points characterized by scale-invariant feature transform (SIFT) descriptors are used as natural landmarks. They are indexed into two databases: a location vector space model (LVSM) and a location database. The localization process consists of two stages: coarse localization and fine localization. Coarse localization from the LVSM is fast, but not accurate enough, whereas localization from the location database using a voting algorithm is relatively slow, but more accurate. The integration of coarse and fine stages makes fast and reliable localization possible. If necessary, the localization result can be verified by epipolar geometry between the representative view in the database and the view to be localized. In addition, the localization system recovers the position of the camera by essential matrix decomposition. The localization system has been tested in indoor and outdoor environments. The results show that our approach is efficient and reliable. © 2006 IEEE.
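A minimal sketch of what the coarse stage could look like if the location vector space model follows a standard tf-idf weighting over quantized descriptors ("visual words"); the data layout and function names are illustrative assumptions, not the paper's code.

```python
# Illustrative coarse-localization sketch: tf-idf vector space model over visual words.
# Assumes descriptors have already been quantized to word indices; all names are hypothetical.
import numpy as np

def build_lvsm(word_histograms):
    """word_histograms: (n_locations, vocab_size) visual-word counts per location."""
    tf = word_histograms / np.maximum(word_histograms.sum(axis=1, keepdims=True), 1)
    df = np.count_nonzero(word_histograms, axis=0)              # locations containing each word
    idf = np.log(word_histograms.shape[0] / np.maximum(df, 1))
    return tf * idf, idf                                        # weighted location vectors, idf weights

def coarse_localize(query_hist, location_vectors, idf, top_k=5):
    """Rank locations by cosine similarity between tf-idf vectors; return top_k candidates."""
    q = (query_hist / max(query_hist.sum(), 1)) * idf
    sims = location_vectors @ q / (
        np.linalg.norm(location_vectors, axis=1) * (np.linalg.norm(q) + 1e-12) + 1e-12)
    return np.argsort(sims)[::-1][:top_k]
```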
Abstract:
This paper presents a novel coarse-to-fine global localization approach that is inspired by object recognition and text retrieval techniques. Harris-Laplace interest points characterized by SIFT descriptors are used as natural landmarks. These descriptors are indexed into two databases: an inverted index and a location database. The inverted index is built based on a visual vocabulary learned from the feature descriptors. In the location database, each location is directly represented by a set of scale-invariant descriptors. The localization process consists of two stages: coarse localization and fine localization. Coarse localization from the inverted index is fast but not accurate enough, whereas localization from the location database using a voting algorithm is relatively slow but more accurate. The combination of coarse and fine stages makes fast and reliable localization possible. In addition, if necessary, the localization result can be verified by epipolar geometry between the representative view in the database and the view to be localized. Experimental results show that our approach is efficient and reliable. ©2005 IEEE.
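A minimal sketch of a fine-stage voting scheme of the kind described, assuming each query descriptor votes for the location that owns its nearest database descriptor after a Lowe-style ratio test; the threshold and data layout are assumptions, not taken from the paper.

```python
# Illustrative fine-localization voting sketch: each query descriptor votes for the
# location owning its nearest database descriptor (with a ratio test for distinctiveness).
import numpy as np

def vote_localize(query_desc, db_desc, db_location_ids, ratio=0.8):
    """query_desc: (m, d); db_desc: (n, d); db_location_ids: (n,) location index per descriptor."""
    votes = {}
    for q in query_desc:
        dists = np.linalg.norm(db_desc - q, axis=1)
        nn1, nn2 = np.argsort(dists)[:2]
        if dists[nn1] < ratio * dists[nn2]:          # accept only distinctive matches
            loc = int(db_location_ids[nn1])
            votes[loc] = votes.get(loc, 0) + 1
    return max(votes, key=votes.get) if votes else None
```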
Abstract:
This paper presents a novel approach using combined features to retrieve images containing specific objects, scenes or buildings. The content of an image is characterized by two kinds of features: Harris-Laplace interest points described by the SIFT descriptor, and edges described by the edge color histogram. Edges and corners contain the maximal amount of information necessary for image retrieval. The feature detection in this work is an integrated process: edges are detected directly based on the Harris function, Harris interest points are detected at several scales, and Harris-Laplace interest points are found using the Laplace function. The combination of edges and interest points brings efficient feature detection and a high recognition ratio to the image retrieval system. Experimental results show that this system has good performance. © 2005 IEEE.
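One plausible reading of the edge color histogram feature is a color histogram accumulated only over edge pixels; the sketch below uses Canny for brevity, whereas the paper derives edges directly from the Harris function, and the bin count is an assumption.

```python
# Illustrative edge-color-histogram sketch: histogram of hue values restricted to edge pixels.
# Canny is used here for brevity; the paper instead detects edges from the Harris function.
import cv2
import numpy as np

def edge_color_histogram(image_bgr, bins=36):
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)                        # binary edge mask
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    hue = hsv[..., 0][edges > 0]                            # hue values on edge pixels only
    hist, _ = np.histogram(hue, bins=bins, range=(0, 180))  # OpenCV hue range is [0, 180)
    return hist / max(hist.sum(), 1)                        # normalized descriptor
```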
Abstract:
Chemical looping combustion (CLC) is a means of combusting carbonaceous fuels which inherently separates the greenhouse gas carbon dioxide from the remaining combustion products, and has the potential to be used for the production of high-purity hydrogen. Iron-based oxygen carriers for CLC have been the subject of considerable work; however, there are issues regarding the lifespan of iron-based oxygen carriers over repeated cycles. In this work, haematite (Fe2O3) was reduced in an N2+CO+CO2 mixture within a fluidised bed at 850°C, and oxidised back to magnetite (Fe3O4) in a H2O+N2 mixture, with the subsequent yield of hydrogen during oxidation being of interest. Subsequent cycles started from Fe3O4 and two transition regimes were studied: Fe3O4↔Fe0.947O and Fe3O4↔Fe. Particles were produced by mechanical mixing and by co-precipitation. In the case of co-precipitated particles, Al was added such that the ratio of Fe:Al by weight was 9:1, and the final pH of the particles during precipitation was investigated for its subsequent effect on reactivity. This paper shows that co-precipitated particles containing additives such as Al may be able to achieve consistently high H2 yields when cycling between Fe3O4 and Fe, and that these yields are a function of the ratio of [CO2] to [CO] during reduction, where thermodynamic arguments suggest that the yield should be independent of this ratio. A striking feature of our materials was that particles made by mechanical mixing performed much better than those made by co-precipitation when cycling between Fe3O4 and Fe0.947O, but much worse than co-precipitated particles when cycling between Fe3O4 and Fe.
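For orientation, the idealized stoichiometry behind the two cycling regimes and the hydrogen-producing oxidation can be written as follows (standard iron-steam chemistry, with wüstite written as FeO rather than Fe0.947O for simplicity):

```latex
% Idealized redox cycle (wüstite written as FeO for simplicity):
\begin{aligned}
\mathrm{Fe_3O_4} + \mathrm{CO} &\rightarrow 3\,\mathrm{FeO} + \mathrm{CO_2}
  &&\text{(reduction, } \mathrm{Fe_3O_4}\!\leftrightarrow\!\mathrm{Fe_{0.947}O}\text{ regime)}\\
\mathrm{Fe_3O_4} + 4\,\mathrm{CO} &\rightarrow 3\,\mathrm{Fe} + 4\,\mathrm{CO_2}
  &&\text{(reduction, } \mathrm{Fe_3O_4}\!\leftrightarrow\!\mathrm{Fe}\text{ regime)}\\
3\,\mathrm{Fe} + 4\,\mathrm{H_2O} &\rightarrow \mathrm{Fe_3O_4} + 4\,\mathrm{H_2}
  &&\text{(steam oxidation, hydrogen production)}
\end{aligned}
```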
Abstract:
An experimental evaluation of small two-phase induction motor drives operating with different inverter topologies is described. The results show that a PWM-based four-switch inverter having only low-side switches is attractive for high-speed, low-cost applications where speeds greater than those obtainable with single-phase induction motors are required.
Abstract:
This paper discusses the application of Discrete Event Simulation (DES) in modelling the complex relationship between patient types, case-mix and operating theatre allocation in a large National Health Service (NHS) Trust in London. The simulation model that was constructed described the main features of nine theatres, focusing on operational processes and patient throughput times. The model was used to test three case-mix scenarios and to demonstrate the potential of simulation modelling as a cost-effective method for understanding the issues of healthcare operations management and the role of simulation techniques in problem solving. The results indicated that removing all day cases would reduce patient throughput by 23.3% and the utilization of the orthopaedic theatre in particular by 6.5%. This represents a case example of how DES can be used by healthcare managers to inform decision making. © 2008 IEEE.
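An illustrative SimPy sketch (not the authors' model) of the basic DES pattern involved: patients arrive, queue for one of a fixed number of theatres, occupy it for the surgery duration, and their throughput times are collected. The arrival rate, surgery duration and theatre count below are placeholder assumptions.

```python
# Minimal DES sketch with SimPy: patients arrive, queue for a theatre, occupy it for
# the surgery duration, then leave. All rates/durations are illustrative placeholders.
import random
import simpy

RANDOM_SEED, SIM_MINUTES, N_THEATRES = 42, 8 * 60, 9
throughput_times = []

def patient(env, name, theatres):
    arrival = env.now
    with theatres.request() as req:                      # wait for a free theatre
        yield req
        yield env.timeout(random.expovariate(1 / 90))    # surgery time, mean 90 min (assumed)
    throughput_times.append(env.now - arrival)

def arrivals(env, theatres):
    i = 0
    while True:
        yield env.timeout(random.expovariate(1 / 20))    # new patient every ~20 min (assumed)
        i += 1
        env.process(patient(env, f"patient-{i}", theatres))

random.seed(RANDOM_SEED)
env = simpy.Environment()
theatres = simpy.Resource(env, capacity=N_THEATRES)
env.process(arrivals(env, theatres))
env.run(until=SIM_MINUTES)
print(f"patients completed: {len(throughput_times)}, "
      f"mean throughput time: {sum(throughput_times) / max(len(throughput_times), 1):.1f} min")
```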
Abstract:
The combination of high-frequency, high-power and high-efficiency capabilities is a feature of vacuum tube technology. For most applications, large bandwidths are required, and the modulation method should therefore also allow large-bandwidth operation. Optically modulated cold cathodes, which avoid the use of resonant cavities, should satisfy this requirement. This is why we have developed a carbon nanotube based photocathode. © 2009 IEEE.
Abstract:
Lab-on-a-chip (LOC) is one of the most important microsystem applications, with promise for use in microanalysis, drug development, and the diagnosis of illness and disease. An LOC typically consists of two main components: microfluidics and sensors. Integration of microfluidics and sensors on a single chip can greatly enhance the efficiency of biochemical reactions and the sensitivity of detection, increase the reaction/detection speed, and reduce potential cross-contamination, fabrication time and cost. However, the mechanisms generally used for microfluidics and sensors are different, complicating the integration of the two main components and increasing the cost of the systems. A lab-on-a-chip system based on a single surface acoustic wave (SAW) actuation mechanism is proposed. SAW devices were fabricated on nanocrystalline ZnO thin films deposited on Si substrates by sputtering. Coupling of acoustic waves into a liquid induces acoustic streaming and motion of droplets. A streaming velocity of up to ∼5 cm/s and droplet pumping speeds of ∼1 cm/s were obtained. It was also found that a higher-order mode wave, the Sezawa wave, is more effective in streaming and transportation of microdroplets. The ZnO SAW sensor has been used for prostate antigen/antibody biorecognition systems, demonstrating the feasibility of using a single actuation mechanism for lab-on-a-chip applications. © 2010 Materials Research Society.
Abstract:
We present a new software framework for the implementation of applications that use stencil computations on block-structured grids to solve partial differential equations. A key feature of the framework is the extensive use of automatic source code generation which is used to achieve high performance on a range of leading multi-core processors. Results are presented for a simple model stencil running on Intel and AMD CPUs as well as the NVIDIA GT200 GPU. The generality of the framework is demonstrated through the implementation of a complete application consisting of many different stencil computations, taken from the field of computational fluid dynamics. © 2010 IEEE.
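As a reference point for the kind of kernel such a framework targets, the following NumPy sketch shows an unoptimized 5-point Jacobi stencil sweep over a 2D grid patch; the framework itself would generate tuned multi-core or GPU code for kernels of this shape rather than use this naive form.

```python
# Reference (unoptimized) 5-point Jacobi stencil on a 2D block-structured grid patch.
# This NumPy version only shows the arithmetic of one sweep; generated code would be
# specialised and tuned per target architecture.
import numpy as np

def jacobi_sweep(u):
    """One Jacobi relaxation sweep over the interior of a 2D array u."""
    out = u.copy()
    out[1:-1, 1:-1] = 0.25 * (u[:-2, 1:-1] + u[2:, 1:-1] + u[1:-1, :-2] + u[1:-1, 2:])
    return out

# Example: relax a grid with a hot boundary on one side.
u = np.zeros((128, 128))
u[0, :] = 1.0
for _ in range(100):
    u = jacobi_sweep(u)
```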
Abstract:
Effective dialogue management is critically dependent on the information that is encoded in the dialogue state. In order to deploy reinforcement learning for policy optimization, dialogue must be modeled as a Markov Decision Process. This requires that the dialogue state encode all relevant information obtained during the dialogue prior to that state. This can be achieved by combining the user goal, the dialogue history, and the last user action to form the dialogue state. In addition, to gain robustness to input errors, dialogue must be modeled as a Partially Observable Markov Decision Process (POMDP) and hence a distribution over all possible states must be maintained at every dialogue turn. This poses a potential computational limitation since there can be a very large number of dialogue states. The Hidden Information State model provides a principled way of ensuring tractability in a POMDP-based dialogue model. The key feature of this model is the grouping of user goals into partitions that are dynamically built during the dialogue. In this article, we extend this model further to incorporate the notion of complements. This allows a more complex user goal to be represented, and it enables an effective pruning technique to be implemented that preserves the overall system performance within a limited computational resource more effectively than existing approaches. © 2011 ACM.
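A toy sketch of the flat belief update that makes POMDP dialogue tracking expensive and that partitioning is designed to tame: the distribution over state hypotheses is rescaled by the observation likelihood each turn (transition dynamics omitted for brevity). The state and observation model here are illustrative assumptions, not the HIS model itself.

```python
# Toy belief update over dialogue-state hypotheses: b'(s) proportional to P(obs | s) * b(s).
# Transition dynamics are omitted for brevity. In the Hidden Information State model,
# hypotheses are grouped into partitions so this update stays tractable; this just shows
# the flat update it approximates.
import numpy as np

def belief_update(belief, obs_likelihood):
    """belief: (n_states,) prior over hypotheses; obs_likelihood: (n_states,) P(o | s)."""
    posterior = belief * obs_likelihood
    total = posterior.sum()
    return posterior / total if total > 0 else belief   # renormalize (keep prior if obs impossible)

# Example: three competing user-goal hypotheses; noisy input favours the second one.
b = np.array([0.5, 0.3, 0.2])
print(belief_update(b, np.array([0.1, 0.7, 0.2])))
```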
Abstract:
Purpose - The purpose of this paper is to develop a framework of the total acquisition cost of overseas outsourcing/sourcing in the manufacturing industry. This framework contains categorized cost items that may occur during the overseas outsourcing/sourcing process. The framework was tested by a case study to establish both its feasibility and usability. Design/methodology/approach - First, interviews were carried out with practitioners who have experience of overseas outsourcing/sourcing in order to obtain inputs from industry. The framework was then built up based on combined inputs from the literature and from practitioners. Finally, the framework was tested by a case study in a multinational high-tech manufacturer to establish both its feasibility and usability. Findings - A practical barrier to implementing this framework is a shortage of information. The predictability of the cost items in the framework varies. How to deal with the trade-off between accuracy and applicability is a problem that needs to be solved in future research. Originality/value - There are always limitations to the generalizations that can be made from just one case. However, despite these limitations, this case study is believed to have shown the general requirement of modeling the uncertainty and dealing with the dilemma between accuracy and applicability in practice. © Emerald Group Publishing Limited.
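As an illustration of how categorized cost items might be aggregated while representing the uncertainty the paper highlights, the sketch below sums deterministic and sampled cost items by Monte Carlo; the categories and distributions are placeholders, not those of the proposed framework.

```python
# Illustrative total-acquisition-cost aggregation with uncertain items sampled by Monte Carlo.
# Cost categories and distributions are placeholders, not the categories of the framework.
import random

def sample_total_cost():
    unit_price      = 120.0                               # well-known, deterministic item
    logistics       = random.triangular(8.0, 15.0, 10.0)  # uncertain: low, high, most likely
    quality_failure = random.triangular(0.0, 20.0, 5.0)   # uncertain rework/defect cost
    coordination    = random.triangular(3.0, 12.0, 6.0)   # uncertain management overhead
    return unit_price + logistics + quality_failure + coordination

samples = sorted(sample_total_cost() for _ in range(10_000))
print(f"expected total cost per unit: {sum(samples) / len(samples):.1f}")
print(f"90th percentile:              {samples[int(0.9 * len(samples))]:.1f}")
```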