157 results for gain-coupling DFB laser


Relevance:

20.00%

Publisher:

Abstract:

Parallel interleaved converters are finding more applications every day; for example, they are frequently used for VRMs on PC motherboards, mainly to obtain better transient response. Parallel interleaved converters can have their inductances uncoupled, directly coupled or inversely coupled, each option suiting different applications with associated advantages and disadvantages. Coupled systems offer more control over converter features such as ripple currents, inductance volume and transient response. To gain an intuitive understanding of which type of parallel interleaved converter, what amount of coupling, what number of levels and how much inductance should be used for a given application, a simple equivalent model is needed. As all phases of an interleaved converter are assumed to be identical, the equivalent model is nothing more than a single inductance common to all phases. Without this simplification, the design of a coupled system is quite daunting. Designing a coupled system involves solving and understanding the RMS currents of the input, the individual phases (or cells) and the output. A procedure using this equivalent model and a small amount of modulo arithmetic is detailed; a rough illustration of the modulo step follows.
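
As a rough illustration of the kind of modulo arithmetic involved (our sketch of the idea, not the paper's actual procedure), the Python snippet below computes the quantities that govern output ripple in an N-phase interleaved buck converter: with N identical phases shifted by 1/N of a switching period, the output sees an effective switching frequency of N·f_sw and an effective duty cycle of frac(N·D). The function name and the buck topology are our assumptions.

```python
import math

def interleaved_output_ripple(n_phases: int, duty: float):
    """Effective output-ripple parameters for an N-phase interleaved buck.

    m = floor(N*D) phases conduct simultaneously at any instant; only
    the fractional remainder frac(N*D) contributes output ripple."""
    m = math.floor(n_phases * duty)        # overlapping phases
    d_eff = n_phases * duty - m            # modulo step: frac(N*D)
    freq_multiplier = n_phases             # ripple frequency = N * f_sw
    # Normalised ripple shape factor: vanishes whenever N*D is an
    # integer, the duty cycles at which interleaving cancels ripple.
    ripple_factor = d_eff * (1.0 - d_eff)
    return m, d_eff, freq_multiplier, ripple_factor

# Example: 4 phases at D = 0.3 gives N*D = 1.2, so m = 1, d_eff = 0.2.
print(interleaved_output_ripple(4, 0.3))
```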

Relevance:

20.00%

Publisher:

Abstract:

YBCO thin films were fabricated by laser deposition, in situ on MgO substrates, using both O₂ and N₂O as process gas. Films with Tc above 90 K and jc of 10⁶ A/cm² at 77 K were grown in oxygen at a substrate temperature of 765 °C. Using N₂O, the optimum substrate temperature was 745 °C, giving a Tc of 87 K. At lower temperatures, the films made in N₂O had higher Tc (79 K) than the films made in oxygen (66 K). SEM and STM investigations of the film surfaces showed the films to consist of a comparatively smooth background surface and a distribution of larger particles. Both the particle size and the distribution density depended on the substrate temperature.

Relevance:

20.00%

Publisher:

Abstract:

PURPOSE: To investigate the utility of non-contact laser-scanning confocal microscopy (NC-LSCM), compared with the more conventional contact laser-scanning confocal microscopy (C-LSCM), for examining corneal substructures in vivo. METHODS: An attempt was made to capture representative images of the tear film and all layers of the cornea of a healthy 35-year-old female, using both NC-LSCM and C-LSCM, on separate days. RESULTS: Using NC-LSCM, good-quality images were obtained of the tear film, stroma, and a section of endothelium, but the corneal depth of the images of these various substructures could not be ascertained. Using C-LSCM, good-quality, full-field images were obtained of the epithelium, subbasal nerve plexus, stroma, and endothelium, and the corneal depth of each of the captured images could be ascertained. CONCLUSIONS: NC-LSCM may find general use for clinical examination of the tear film, stroma and endothelium, with the caveat that the depth of stromal images cannot be determined with this technique. The technique also facilitates image capture of oblique sections through multiple corneal layers. The inability to clearly and consistently image thin corneal substructures, such as the tear film, subbasal nerve plexus and endothelium, is a key limitation of NC-LSCM.

Relevance:

20.00%

Publisher:

Abstract:

This paper explores the use of subarrays as array elements. Benefits of such a concept include improved gain in any direction, without significantly increasing the overall size of the array, and enhanced pattern control. The architecture for an array of subarrays is discussed via a systems approach. Individual system designs are explored in further detail, and proof of principle is illustrated through manufactured examples.
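
As background for the array-of-subarrays concept, the classical pattern-multiplication principle states that, for identical subarrays, the total far-field pattern is the product of the subarray pattern and the array factor of the subarray centres. A minimal numpy sketch of this principle (our illustration, not the paper's design; the geometry values are placeholders):

```python
import numpy as np

def array_factor(positions, k, theta):
    """Array factor of uniformly excited isotropic elements at the given
    linear positions (metres), for angles theta (radians off broadside)."""
    # AF(theta) = sum_n exp(j * k * x_n * sin(theta))
    return np.exp(1j * k * np.outer(positions, np.sin(theta))).sum(axis=0)

wavelength = 1.0
k = 2 * np.pi / wavelength
theta = np.linspace(-np.pi / 2, np.pi / 2, 721)

sub = np.arange(4) * 0.5 * wavelength      # 4-element subarray, lambda/2 spacing
centres = np.arange(8) * 2.0 * wavelength  # 8 subarray centres, 2*lambda apart

# Pattern multiplication: total pattern = subarray pattern * array factor
# of the subarray centres (valid when all subarrays are identical).
total = array_factor(sub, k, theta) * array_factor(centres, k, theta)
print(np.abs(total).max())  # broadside peak equals 4 * 8 = 32 elements
```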

Relevance:

20.00%

Publisher:

Abstract:

Agent-based modelling (ABM), like other modelling techniques, is used to answer specific questions about real-world systems that could otherwise be expensive or impractical to study. Its recent gain in popularity can be attributed to some degree to its capacity to use information at a fine level of detail of the system, both geographically and temporally, and to generate information at a higher level, where emerging patterns can be observed. The technique is data-intensive, as explicit data at a fine level of detail is used, and computer-intensive, as it requires many interactions between agents, which can learn and have goals. With the growing availability of data and the increase in computing power, these concerns are, however, fading. Nonetheless, being able to update or extend the model as more information becomes available can become problematic, because of the tight coupling of the agents and their dependence on the data, especially when modelling very large systems. One large system to which ABM is currently applied is electricity distribution, where thousands of agents representing the network and the consumers' behaviours interact with one another. A framework that aims at answering a range of questions regarding the potential evolution of the grid has been developed and is presented here. It uses agent-based modelling to represent the engineering infrastructure of the distribution network and has been built with flexibility and extensibility in mind. What distinguishes the method presented here from usual ABMs is that this ABM has been developed in a compositional manner. This encompasses not only the software tool, whose core is named MODAM (MODular Agent-based Model), but the model itself. Such an approach enables the model to be extended as more information becomes available, or modified as the electricity system evolves, leading to an adaptable model.

Two well-known modularity principles in the software engineering domain are information hiding and separation of concerns. These principles were used to develop the agent-based model on top of OSGi and Eclipse plugins, which have good support for modularity. Information regarding the model entities was separated into a) assets, which describe the entities' physical characteristics, and b) agents, which describe their behaviour according to their goals and previous learning experiences. This approach diverges from the traditional approach, where both aspects are often conflated. It has many advantages in terms of reusability of one or the other aspect for different purposes, as well as composability when building simulations. For example, the way an asset is used on a network can vary greatly while its physical characteristics stay the same; this is the case for two identical battery systems whose usage will vary depending on the purpose of their installation. While any battery can be described by its physical properties (e.g. capacity, lifetime, and depth of discharge), its behaviour will vary depending on who is using it and what their aim is. The model is populated using data describing both aspects (physical characteristics and behaviour) and can be updated as required, depending on which simulation is to be run. For example, data can be used to describe the environment to which the agents respond (e.g. weather for solar panels) or to describe the assets and their relation to one another (e.g. the network assets).

Finally, when running a simulation, MODAM calls on its module manager, which coordinates the different plugins, automates the creation of assets and agents using factories, and schedules their execution, which can be done sequentially or in parallel for faster execution. Building agent-based models in this way has proven fast when adding new complex behaviours as well as new types of assets. Simulations have been run to understand the potential impact of changes on the network in terms of assets (e.g. installation of decentralised generators) or behaviours (e.g. response to different management aims). While this platform has been developed within the context of a project focussing on the electricity domain, the core of the software, MODAM, can be extended to other domains, such as transport, which is part of future work with the addition of electric vehicles. A minimal sketch of the asset/agent separation follows.
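
To make the asset/agent separation concrete, here is a minimal Python sketch in the spirit of the description above (the class and method names are ours, not MODAM's API): a battery's physical characteristics live in a reusable asset object, while two different agents drive identical assets toward different goals.

```python
from dataclasses import dataclass

@dataclass
class BatteryAsset:
    """Physical characteristics only; reusable across simulations."""
    capacity_kwh: float
    max_depth_of_discharge: float   # usable fraction of capacity
    state_kwh: float = 0.0

    def usable_kwh(self) -> float:
        reserve = (1 - self.max_depth_of_discharge) * self.capacity_kwh
        return max(self.state_kwh - reserve, 0.0)

class SelfConsumptionAgent:
    """Behaviour: store local solar surplus, discharge to cover load.
    step() returns grid exchange per one-hour step (positive = export)."""
    def __init__(self, asset: BatteryAsset):
        self.asset = asset

    def step(self, solar_kw: float, load_kw: float) -> float:
        surplus = solar_kw - load_kw
        if surplus > 0:
            charge = min(surplus, self.asset.capacity_kwh - self.asset.state_kwh)
            self.asset.state_kwh += charge
            return surplus - charge
        discharge = min(-surplus, self.asset.usable_kwh())
        self.asset.state_kwh -= discharge
        return surplus + discharge

class PeakShavingAgent:
    """Same asset type, different goal: discharge only above a threshold."""
    def __init__(self, asset: BatteryAsset, threshold_kw: float):
        self.asset = asset
        self.threshold_kw = threshold_kw

    def step(self, solar_kw: float, load_kw: float) -> float:
        net = load_kw - solar_kw
        if net > self.threshold_kw:
            discharge = min(net - self.threshold_kw, self.asset.usable_kwh())
            self.asset.state_kwh -= discharge
            return -(net - discharge)
        return -net

# Two physically identical batteries, two behaviours:
a1 = SelfConsumptionAgent(BatteryAsset(10.0, 0.8, state_kwh=5.0))
a2 = PeakShavingAgent(BatteryAsset(10.0, 0.8, state_kwh=5.0), threshold_kw=3.0)
print(a1.step(solar_kw=2.0, load_kw=4.0), a2.step(solar_kw=2.0, load_kw=4.0))
```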

Relevance:

20.00%

Publisher:

Abstract:

This work considers the problem of building high-fidelity 3D representations of the environment from sensor data acquired by mobile robots. Multi-sensor data fusion allows for more complete and accurate representations, and for more reliable perception, especially when different sensing modalities are used. In this paper, we present a thorough experimental analysis of the performance of 3D surface reconstruction from laser and mm-wave radar data using Gaussian Process Implicit Surfaces (GPIS), in a realistic field robotics scenario. We first analyse the performance of GPIS using raw laser data alone and raw radar data alone, with different choices of covariance matrices and different resolutions of the input data. We then evaluate and compare the performance of two different GPIS fusion approaches. The first, state-of-the-art approach directly fuses raw data from laser and radar. The alternative approach proposed in this paper first computes an initial estimate of the surface from each single source of data, and then fuses these two estimates. We show that this method outperforms the state of the art, especially in situations where the sensors react differently to the targets they perceive.
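
The second fusion approach can be pictured as combining, at each query point, two Gaussian estimates of the implicit-surface value. Below is a minimal sketch of that final fusion step by inverse-variance weighting (the product of two Gaussians); the function name and the exact weighting rule are our illustration of the idea, not necessarily the paper's estimator.

```python
import numpy as np

def fuse_gp_estimates(mu_laser, var_laser, mu_radar, var_radar):
    """Fuse two independent Gaussian estimates of the implicit-surface
    value at each query point by inverse-variance weighting."""
    w_l, w_r = 1.0 / var_laser, 1.0 / var_radar
    var_fused = 1.0 / (w_l + w_r)
    mu_fused = var_fused * (w_l * mu_laser + w_r * mu_radar)
    return mu_fused, var_fused

# Where one sensor is uncertain (e.g. a target it perceives poorly),
# the fused estimate leans on the other sensor, and vice versa.
mu, var = fuse_gp_estimates(
    mu_laser=np.array([0.10, 0.00]), var_laser=np.array([0.01, 0.25]),
    mu_radar=np.array([0.40, 0.05]), var_radar=np.array([0.25, 0.01]))
print(mu, var)
```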

Relevance:

20.00%

Publisher:

Abstract:

Field robots often rely on laser range finders (LRFs) to detect obstacles and navigate autonomously. Despite recent progress in sensing technology and perception algorithms, adverse environmental conditions, such as the presence of smoke, remain a challenging issue for these robots. In this paper, we investigate the possibility of improving laser-based perception applications by anticipating situations in which laser data are affected by smoke, using supervised learning and state-of-the-art visual image quality analysis. We propose to train a k-nearest-neighbour (kNN) classifier to recognise situations where a laser scan is likely to be affected by smoke, based on visual data quality features. This method is evaluated experimentally using a mobile robot equipped with LRFs and a visual camera. The strengths and limitations of the technique are identified and discussed, and we show that the method is beneficial when conservative decisions are the most appropriate.
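
A minimal sketch of this classification step, assuming image-quality features have already been extracted for each camera frame (the feature values, their meaning, and k are placeholders of ours; scikit-learn provides the kNN):

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# One row per visual frame; columns are image-quality features
# (e.g. contrast, sharpness, naturalness scores) -- placeholder values.
X_train = np.array([
    [0.82, 0.91, 0.75],   # clear scene
    [0.31, 0.22, 0.40],   # smoky scene
    [0.78, 0.85, 0.70],   # clear scene
    [0.25, 0.30, 0.35],   # smoky scene
])
y_train = np.array([0, 1, 0, 1])  # 1 = co-acquired laser scan smoke-affected

clf = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=3))
clf.fit(X_train, y_train)

# At run time, flag scans predicted to be smoke-affected so the robot
# can fall back to a conservative behaviour.
print(clf.predict(np.array([[0.28, 0.26, 0.38]])))  # -> [1]
```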

Relevance:

20.00%

Publisher:

Abstract:

This paper presents an approach to promote the integrity of perception systems for outdoor unmanned ground vehicles (UGV) operating in challenging environmental conditions (presence of dust or smoke). The proposed technique automatically evaluates the consistency of the data provided by two sensing modalities: a 2D laser range finder and a millimetre-wave radar, allowing for perceptual failure mitigation. Experimental results, obtained with a UGV operating in rural environments, and an error analysis validate the approach.
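
One simple way to picture such a consistency check (our sketch, not the paper's actual metric): compare the ranges returned by the two sensors along matching bearings and flag bearings where they disagree by more than the expected sensor noise.

```python
import numpy as np

def inconsistent_bearings(laser_ranges, radar_ranges, tol_m=0.5):
    """Flag bearings where 2D laser and radar ranges disagree.

    Both arrays hold one range (metres) per shared bearing. In dust or
    smoke the laser tends to return spuriously short ranges while the
    radar still penetrates, so a large |laser - radar| gap marks data
    that should not be trusted blindly."""
    gap = np.abs(np.asarray(laser_ranges) - np.asarray(radar_ranges))
    return np.flatnonzero(gap > tol_m)

laser = [10.2, 3.1, 9.8, 10.1]   # 3.1 m: a dust cloud in front of the UGV?
radar = [10.0, 9.9, 9.7, 10.0]
print(inconsistent_bearings(laser, radar))  # -> [1]
```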

Relevance:

20.00%

Publisher:

Abstract:

Camera-laser calibration is necessary for many robotics and computer vision applications. However, existing calibration toolboxes still require laborious effort from the operator in order to achieve reliable and accurate results. This paper proposes algorithms that augment two existing trusted calibration methods with automatic extraction of the calibration object from the sensor data. The result is a complete procedure that allows for automatic camera-laser calibration. The first stage of the procedure is automatic camera calibration, which is useful in its own right for many applications. The chessboard extraction algorithm it provides is shown to outperform openly available techniques. The second stage completes the procedure by providing automatic camera-laser calibration. The procedure has been verified by extensive experimental tests, with the proposed algorithms providing a major reduction in the time required from an operator in comparison to manual methods.
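
For orientation, the generic camera-calibration stage looks like the OpenCV sketch below; this is the openly available baseline against which the paper's chessboard extractor is compared, not the proposed algorithm itself, and the filenames, board dimensions and square size are placeholders.

```python
import cv2
import numpy as np

PATTERN = (9, 6)    # inner-corner grid of the chessboard (placeholder)
SQUARE_M = 0.025    # square edge length in metres (placeholder)

# 3D corner positions in the board frame (z = 0 plane).
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_M

obj_points, img_points = [], []
for path in ["calib_00.png", "calib_01.png"]:   # placeholder filenames
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        continue
    found, corners = cv2.findChessboardCorners(gray, PATTERN, None)
    if found:
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_points.append(objp)
        img_points.append(corners)

# Intrinsics K and distortion coefficients from all detected boards.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("reprojection RMS:", rms)
```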

Relevance:

20.00%

Publisher:

Abstract:

This work aims to promote integrity in autonomous perceptual systems, with a focus on outdoor unmanned ground vehicles equipped with a camera and a 2D laser range finder. A method to check for inconsistencies between the data provided by these two heterogeneous sensors is proposed and discussed. First, uncertainties in the estimated transformation between the laser and camera frames are evaluated and propagated up to the projection of the laser points onto the image. Then, for each laser scan-camera image pair acquired, the information at corners of the laser scan is compared with the content of the image, resulting in a likelihood of correspondence. The result of this process is then used to validate segments of the laser scan that are found to be consistent with the image, while inconsistent segments are rejected. Experimental results illustrate how this technique can improve the reliability of perception in challenging environmental conditions, such as in the presence of airborne dust.
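
The projection step at the heart of this check can be sketched with a pinhole model (a minimal illustration of ours; the extrinsics T and intrinsics K are assumed to come from a prior camera-laser calibration, and the sketch omits the uncertainty propagation described above):

```python
import numpy as np

def project_laser_points(points_laser, T_cam_laser, K):
    """Project 3D laser points (N x 3, laser frame) into the image.

    T_cam_laser: 4x4 homogeneous transform, laser frame -> camera frame.
    K: 3x3 camera intrinsics. Returns N x 2 pixel coordinates; points
    behind the camera come back as NaN."""
    n = points_laser.shape[0]
    homog = np.hstack([points_laser, np.ones((n, 1))])
    cam = (T_cam_laser @ homog.T).T[:, :3]          # camera-frame XYZ
    uvw = (K @ cam.T).T
    z = uvw[:, 2:3]
    return np.where(z > 0, uvw[:, :2] / z, np.nan)  # perspective divide

K = np.array([[700.0, 0, 320], [0, 700.0, 240], [0, 0, 1]])
T = np.eye(4)                                       # placeholder extrinsics
pts = np.array([[0.0, 0.0, 5.0], [0.5, -0.2, 4.0]])
print(project_laser_points(pts, T, K))
```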

Relevance:

20.00%

Publisher:

Abstract:

Highlights

• Diabetic foot ulcers (DFUs) are a major complication of diabetes.
• We describe the development of next-generation technologies for DFU repair.
• We highlight the modest success of growth factor-, scaffold-, and cell-based DFU therapies.
• We rationalize that combination therapies will be necessary to enable effective and reliable DFU repair.

Relevance:

20.00%

Publisher:

Abstract:

Using Gray and McNaughton's revised Reinforcement Sensitivity Theory (RST), this study investigated the extent to which the Behavioural Approach System (BAS) and the Fight-Flight-Freeze System (FFFS) influence the processing of gain-framed and loss-framed road safety messages and subsequent message acceptance. It was predicted that stronger BAS sensitivity and FFFS sensitivity would be associated with greater processing and acceptance of the gain-framed and loss-framed messages, respectively. Young drivers (N = 80, aged 17–25 years) viewed one of four road safety messages and completed a lexical decision task to assess message processing. Both self-report (e.g., the Corr-Cooper RST-PQ) and behavioural measures (i.e., the CARROT and Q-Task) were used to assess BAS and FFFS traits. Message acceptance was measured via self-report ratings of message effectiveness, behavioural intentions, attitudes and subsequent driving behaviour. The results are discussed in the context of the effect that differences in reward and punishment sensitivities may have on message processing and message acceptance.

Relevance:

20.00%

Publisher:

Abstract:

Radical-directed dissociation of gas phase ions is emerging as a powerful and complementary alternative to traditional tandem mass spectrometric techniques for biomolecular structural analysis. Previous studies have identified that coupling 2-[(2,2,6,6-tetramethylpiperidin-1-oxyl)methyl]benzoic acid (TEMPO-Bz) to the N-terminus of a peptide introduces a labile oxygen-carbon bond that can be selectively activated upon collisional activation to produce a radical ion. Here we demonstrate that structurally defined peptide radical ions can also be generated upon UV laser photodissociation of the same TEMPO-Bz derivatives in a linear ion-trap mass spectrometer. When subjected to further mass spectrometric analyses, the radical ions formed by a single laser pulse undergo identical dissociations to those formed by collisional activation of the same precursor ion, and can thus be used to derive molecular structure. Mapping the initial radical formation process as a function of photon energy by photodissociation action spectroscopy reveals that photoproduct formation is selective but occurs only in modest yield across the wavelength range (300-220 nm), with the photoproduct yield maximised between 225 and 235 nm. Based on the analysis of a set of model compounds, structural modifications to the TEMPO-Bz derivative are suggested to optimise radical photoproduct yield. Future development of such probes offers the advantage of increased sensitivity and selectivity for radical-directed dissociation.
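
For orientation, the photon energies spanned by this wavelength range follow from the standard conversion (our arithmetic, not a value from the paper):

```latex
E = \frac{hc}{\lambda} \approx \frac{1240\ \text{eV nm}}{\lambda}
\quad\Rightarrow\quad
E(300\ \text{nm}) \approx 4.1\ \text{eV}, \qquad
E(220\ \text{nm}) \approx 5.6\ \text{eV}
```

so the yield maximum between 225 and 235 nm corresponds to photons of roughly 5.3-5.5 eV.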

Relevance:

20.00%

Publisher:

Abstract:

Measuring Earth material behaviour on time scales of millions of years transcends our current capability in the laboratory. We review an alternative path that combines multiscale and multiphysics approaches with quantitative structure-property relationships. This approach provides a sound basis for incorporating physical principles such as chemistry, thermodynamics, diffusion and geometry-energy relations into simulations and data assimilation across the vast range of length and time scales encountered in the Earth. We identify key length scales for Earth systems processes and find a substantial scale separation between chemical, hydrous and thermal diffusion. We propose that this separation allows a simplified two-scale analysis in which the outputs of the micro-scale model are used as inputs for meso-scale simulations, which in turn become the micro-model for the next scale up. We present two fundamental theoretical approaches to link the scales: asymptotic homogenisation from a macroscopic, thermodynamic view and percolation renormalisation from a microscopic, statistical mechanics view.
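
The asymptotic-homogenisation route can be summarised by the standard two-scale ansatz (textbook form, shown for orientation rather than taken from the paper). With scale-separation parameter ε = ℓ/L and fast variable y = x/ε, the field is expanded as

```latex
u^{\varepsilon}(x) = u_0(x, y) + \varepsilon\, u_1(x, y)
                   + \varepsilon^{2} u_2(x, y) + \cdots,
\qquad y = \frac{x}{\varepsilon},
```

and collecting powers of ε after substitution into the governing equations yields a micro-scale cell problem whose solution supplies the effective coefficients of the macro-scale equations, which is exactly the micro-to-meso chaining described above.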