991 results for Radar Braking Systems.


Relevance:

80.00%

Publisher:

Abstract:

Life's perfect partnership starts with the placenta. If we get this right, we have the best chance of a healthy life. In preeclampsia, we have a failing placenta. Preeclampsia kills one pregnant woman every minute, and the life expectancy of those who survive is greatly reduced. Preeclampsia is treated roughly the same way it was when Thomas Edison was making the first silent movie. Globally, millions of women risk death to give birth each year and almost 300,000 lose their lives in this process. Over half a million babies around the world die each year as a consequence of preeclampsia. Despite decades of research, we lack pharmacological agents to treat it. Maternal endothelial dysfunction is a central phenomenon responsible for the clinical signs of preeclampsia. In the late nineties, we discovered that vascular endothelial growth factor (VEGF) stimulated nitric oxide release. This led us to suggest that preeclampsia arises due to the loss of VEGF activity, possibly due to a rise in soluble Flt-1 (sFlt-1), the natural antagonist of VEGF. Researchers have shown that high sFlt-1 elicits preeclampsia-like signs in pregnant rats and that sFlt-1 increases before the clinical signs of preeclampsia appear in pregnant women. We demonstrated that removing or reducing this culprit protein from the preeclamptic placenta restored the angiogenic balance. Heme oxygenase-1 (HO-1 or Hmox1), which generates carbon monoxide (CO), biliverdin (rapidly converted to bilirubin) and iron, is cytoprotective. We showed that the Hmox1/CO pathway prevents human placental injury caused by pro-inflammatory cytokines and suppresses sFlt-1 and soluble endoglin release, the factors responsible for preeclampsia phenotypes. The other key enzyme we identified is the hydrogen sulfide-generating cystathionine gamma-lyase (CSE or Cth). These are the only two enzyme systems shown to suppress sFlt-1 and to act as protective pathways against preeclampsia phenotypes in animal models. We also showed that when hydrogen sulfide restores the placental vasculature, it also improves lagging fetal growth. These molecules act as the inhibitory systems in pregnancy, and when they fail, preeclampsia is triggered. Discovering that statins induce these enzymes led us to an RCT (the StAmP Trial) to develop a low-cost therapy to prevent or treat preeclampsia. If you think of pregnancy as a car, then preeclampsia is an accelerator–brake defect disorder. Inflammation, oxidative stress and an imbalance in the angiogenic milieu fuel the ‘accelerator’. It is the failure of the braking systems (the endogenous protective pathways) that lets the ‘accelerator’ run out of control until the system crashes, manifesting itself as preeclampsia.

Relevance:

40.00%

Publisher:

Abstract:

Although extensively studied within the lidar community, the multiple scattering phenomenon has always been considered a rare curiosity by radar meteorologists. Until a few years ago its appearance had only been associated with two- or three-body scattering features (e.g. hail flares and mirror images) involving highly reflective surfaces. Recent atmospheric research aimed at better understanding the water cycle and the role played by clouds and precipitation in affecting the Earth's climate has driven the deployment of high-frequency radars in space. Examples are the TRMM 13.5 GHz, the CloudSat 94 GHz, the upcoming EarthCARE 94 GHz, and the GPM dual-frequency 13 and 35 GHz radars. These systems are able to detect the vertical distribution of hydrometeors and thus provide crucial feedback for radiation and climate studies. The shift towards higher frequencies increases the sensitivity to hydrometeors, improves the spatial resolution and reduces the size and weight of the radar systems. On the other hand, higher-frequency radars are affected by stronger extinction, especially in the presence of large precipitating particles (e.g. raindrops or hail particles), which may eventually drive the signal below the minimum detection threshold. In such circumstances the interpretation of the radar equation via the single scattering approximation may be problematic. Errors will be large when radiation that has interacted with the medium more than once still contributes substantially to the received power. This is the case when the transport mean free path becomes comparable with the instrument footprint (determined by the antenna beamwidth and the platform altitude). This situation resembles what has already been experienced in lidar observations, but with a predominance of wide-angle rather than small-angle scattering events. At millimeter wavelengths, hydrometeors diffuse radiation rather isotropically, in contrast to the visible or near-infrared region where scattering is predominantly in the forward direction. A complete understanding of radiation transport modeling and data analysis methods under wide-angle multiple scattering conditions is mandatory for a correct interpretation of echoes observed by space-borne millimeter radars. This paper reviews the status of research in this field. Different numerical techniques currently implemented to account for higher-order scattering are reviewed and their weaknesses and strengths highlighted. Examples of simulated radar backscattering profiles are provided, with particular emphasis on situations in which the multiple scattering contributions become comparable to or overwhelm the single scattering signal. We show evidence of multiple scattering effects in airborne and CloudSat observations, i.e. unique signatures which cannot be explained by single scattering theory. Ideas on how to identify and tackle multiple scattering effects are discussed. Finally, perspectives and suggestions for future work are outlined. This work represents a reference guide for studies focused on modeling radiation transport and interpreting data from high-frequency space-borne radar systems that probe highly opaque scattering media such as thick ice clouds or precipitating clouds.
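As a rough illustration of the footprint criterion mentioned above, the sketch below compares the instrument footprint, set by beamwidth and platform altitude, with the transport mean free path to flag when wide-angle multiple scattering is likely to matter. All numbers are assumed for illustration (loosely inspired by a 94 GHz spaceborne radar); this is not a calculation from the paper.

```python
import numpy as np

def footprint_diameter_m(altitude_m, beamwidth_deg):
    """Approximate footprint diameter of a nadir-pointing space-borne radar."""
    return 2.0 * altitude_m * np.tan(np.radians(beamwidth_deg) / 2.0)

def transport_mean_free_path_km(extinction_km, asymmetry_g):
    """Transport mean free path l* = 1 / (k_ext * (1 - g)), in km."""
    return 1.0 / (extinction_km * (1.0 - asymmetry_g))

altitude_m = 705e3      # platform altitude (assumed)
beamwidth_deg = 0.108   # antenna 3 dB beamwidth (assumed)
k_ext = 4.0             # km^-1, extinction inside a deep convective core (assumed)
g = 0.1                 # asymmetry parameter: near-isotropic scattering at mm wavelengths

footprint_km = footprint_diameter_m(altitude_m, beamwidth_deg) / 1e3
l_star_km = transport_mean_free_path_km(k_ext, g)

print(f"footprint ~ {footprint_km:.2f} km, transport mean free path ~ {l_star_km:.2f} km")
if l_star_km <= footprint_km:
    print("multiple scattering is likely to contribute significantly to the received power")
```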

Relevance:

30.00%

Publisher:

Abstract:

Diffraction tomographic imaging is applied to the imaging of shallowly buried targets with multi-bistatic arrays of transmitters and receivers.

Relevance:

30.00%

Publisher:

Abstract:

A parametric study was carried out to investigate the effects on images reconstructed from ground penetrating radar (GPR) data of (a) the centre frequency of the GPR excitation pulse, (b) the height of the transmitting and receiving antennas above ground level, and (c) the proximity of the buried objects. An integrated software package was developed to streamline the computer simulations, which were based on synthetic data generated by GPRMax.
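As an illustration of how such a parametric sweep might be organised, the sketch below enumerates a hypothetical grid over the three parameters and names the corresponding input and output files. The parameter values, file-naming scheme and the idea of one input file per run are assumptions, and no actual GPRMax input syntax is reproduced here.

```python
from itertools import product

# Hypothetical grid over the three factors varied in the study.
centre_freqs_hz   = [400e6, 900e6, 1.5e9]    # excitation pulse centre frequency
antenna_heights_m = [0.00, 0.05, 0.10]       # Tx/Rx height above ground level
separations_m     = [0.10, 0.20, 0.40]       # spacing between adjacent buried objects

runs = []
for f, h, d in product(centre_freqs_hz, antenna_heights_m, separations_m):
    tag = f"f{f/1e6:.0f}MHz_h{h*100:.0f}cm_d{d*100:.0f}cm"
    runs.append({
        "centre_freq_hz": f,
        "antenna_height_m": h,
        "separation_m": d,
        "input_file": f"gpr_{tag}.in",    # would be rendered into a GPRMax input file
        "output_file": f"gpr_{tag}.out",  # synthetic traces for image reconstruction
    })

print(f"{len(runs)} simulations in the sweep; first input file: {runs[0]['input_file']}")
```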

Relevance:

30.00%

Publisher:

Abstract:

The concept of radar was developed for the estimation of the distance (range) and velocity of a target from a receiver. The distance measurement is obtained by measuring the time taken for the transmitted signal to propagate to the target and return to the receiver. The target's velocity is determined by measuring the Doppler-induced frequency shift of the returned signal caused by the rate of change of the time-delay from the target. As researchers further developed conventional radar systems it became apparent that additional information was contained in the backscattered signal and that this information could in fact be used to describe the shape of the target itself. This is because a target can be considered to be a collection of individual point scatterers, each of which has its own velocity and time-delay. Delay-Doppler parameter estimation of each of these point scatterers therefore corresponds to a mapping of the target's range and cross-range, producing an image of the target. Much research has been done in this area since the early radar imaging work of the 1960s. At present there are two main categories into which radar imaging falls. The first is the case where the backscattered signal is considered to be deterministic. The second is the case where the backscattered signal is of a stochastic nature. In both cases the information which describes the target's scattering function is extracted by the use of the ambiguity function, a function which correlates the backscattered signal in time and frequency with the transmitted signal. In practical situations, it is often necessary to have the transmitter and the receiver of the radar system sited at different locations. The problem in these situations is that a reference signal must be present in order to calculate the ambiguity function. This causes an additional problem in that detailed phase information about the transmitted signal is then required at the receiver. It is this latter problem which has led to the investigation of radar imaging using time-frequency distributions. As will be shown in this thesis, the phase information about the transmitted signal can be extracted from the backscattered signal using time-frequency distributions. The principal aim of this thesis was the development of, and subsequent discussion of, the theory of radar imaging using time-frequency distributions. Consideration is first given to the case where the target is diffuse, i.e. where the backscattered signal has temporal stationarity and a spatially white power spectral density. The complementary situation is also investigated, i.e. where the target is no longer diffuse, but some degree of correlation exists between the time-frequency points. Computer simulations are presented to demonstrate the concepts and theories developed in the thesis. For the proposed radar system to be practically realisable, both the time-frequency distributions and the associated algorithms developed must be able to be implemented in a timely manner. For this reason an optical architecture is proposed. This architecture is specifically designed to obtain the required time and frequency resolution for laser radar imaging. The complex light amplitude distributions produced by this architecture have been computer-simulated using an optical compiler.
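The following sketch illustrates the kind of delay-Doppler processing described above: a narrowband cross-ambiguity surface obtained by correlating the backscattered signal in delay against a reference copy of the transmitted signal over a grid of trial Doppler shifts. The waveform, sample rate and target parameters are invented for illustration; this is not the thesis's time-frequency algorithm.

```python
import numpy as np

def cross_ambiguity(received, transmitted, fs, doppler_bins_hz):
    """Magnitude of a narrowband cross-ambiguity surface.

    For each trial Doppler shift the received signal is de-shifted and then
    correlated in delay against a reference copy of the transmitted signal.
    """
    n = len(transmitted)
    t = np.arange(n) / fs
    surface = np.empty((len(doppler_bins_hz), n))
    for i, nu in enumerate(doppler_bins_hz):
        deshifted = received * np.exp(-2j * np.pi * nu * t)
        surface[i] = np.abs(np.correlate(deshifted, transmitted, mode="same"))
    return surface

# Toy example (all values assumed): an LFM chirp returned with delay and Doppler.
fs, n = 1e6, 2048
t = np.arange(n) / fs
chirp = np.exp(1j * np.pi * 2e8 * t ** 2)                    # transmitted reference
echo = np.roll(chirp, 200) * np.exp(2j * np.pi * 1.5e3 * t)  # circular delay + Doppler shift
doppler_bins = np.linspace(-5e3, 5e3, 41)

amb = cross_ambiguity(echo, chirp, fs, doppler_bins)
dopp_idx, _ = np.unravel_index(amb.argmax(), amb.shape)
print(f"ambiguity peak at trial Doppler {doppler_bins[dopp_idx]:.0f} Hz (true shift 1500 Hz)")
```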

Relevance:

30.00%

Publisher:

Abstract:

With the advances in computer hardware and software development techniques over the past 25 years, digital computer simulation of train movement and traction systems has been widely adopted as a standard computer-aided engineering tool [1] during the design and development stages of existing and new railway systems. Simulators of different approaches and scales are used extensively to investigate various kinds of system studies. Simulation is now proven to be the cheapest means to carry out performance prediction and system behaviour characterisation. When computers were first used to study railway systems, they were mainly employed to perform repetitive but time-consuming computational tasks, such as matrix manipulations for power network solution and exhaustive searches for optimal braking trajectories. With only simple high-level programming languages available at the time, full advantage of the computing hardware could not be taken. Hence, structured simulations of the whole railway system were not very common, and most applications focused on isolated parts of the railway system. It is more appropriate to regard those applications as primarily mechanised calculations rather than simulations. However, a railway system consists of a number of subsystems, such as train movement, power supply and traction drives, which inevitably contain many complexities and diversities. These subsystems interact frequently with each other while the trains are moving, and they have their own special features in different railway systems. To further complicate the simulation requirements, constraints such as track geometry, speed restrictions and friction have to be considered, not to mention possible non-linearities and uncertainties in the system. In order to provide a comprehensive and accurate account of system behaviour through simulation, a large amount of data has to be organised systematically to ensure easy access and efficient representation, and the interactions and relationships among the subsystems should be defined explicitly. These requirements call for sophisticated and effective simulation models for each component of the system. The software development techniques available nowadays allow the evolution of such simulation models. Not only can the applicability of the simulators be greatly enhanced by advanced software design; maintainability and modularity for easy understanding and further development, and portability across hardware platforms, are also encouraged. The objective of this paper is to review the development of a number of approaches to simulation models. Attention is given in particular to models for train movement, power supply systems and traction drives. These models have been successfully used to resolve a wide range of ‘what-if’ questions effectively, concerning, for example, speed profiles, energy consumption and run times.
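To convey what a train movement model computes, here is a minimal point-mass sketch, not taken from any particular simulator: tractive effort, Davis-type running resistance and a crude braking rule are integrated to obtain a run time and traction energy for a single inter-station run. All vehicle and track figures are assumed.

```python
# Minimal point-mass train movement sketch (illustrative values, all assumed).
MASS = 300e3                 # kg, train mass
A, B, C = 2.5e3, 30.0, 6.0   # Davis resistance R(v) = A + B*v + C*v^2  [N]
MAX_TRACTION = 200e3         # N
MAX_BRAKE = 250e3            # N
V_TARGET = 20.0              # m/s, cruising speed
STOP_AT = 2000.0             # m, position of the next station

def resistance(v):
    return A + B * v + C * v * v

dt, t, x, v, energy_J = 0.1, 0.0, 0.0, 0.0, 0.0
while x < STOP_AT:
    # distance needed to stop from the current speed under full braking
    braking_distance = v * v * MASS / (2.0 * (MAX_BRAKE + resistance(v)))
    if STOP_AT - x <= braking_distance:
        force = -MAX_BRAKE            # brake for the station stop
    elif v < V_TARGET:
        force = MAX_TRACTION          # accelerate towards the target speed
    else:
        force = resistance(v)         # hold speed against running resistance
    accel = (force - resistance(v)) / MASS
    v = max(v + accel * dt, 0.0)
    x += v * dt
    energy_J += max(force, 0.0) * v * dt   # count traction energy only
    t += dt

print(f"run time {t:.1f} s, traction energy {energy_J / 3.6e6:.2f} kWh")
```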

Relevance:

30.00%

Publisher:

Abstract:

Simulation has been widely used to estimate the benefits of Cooperative Systems (CS) based on Inter-Vehicular Communications (IVC). This paper presents a new architecture built with the SiVIC simulator and the RTMaps™ multi-sensor prototyping platform. We introduce several improvements over a previous, similar architecture regarding IVC modelling and vehicle control, and the architecture has been tuned with on-road measurements to improve its fidelity. We discuss the results of a freeway emergency braking (EEBL) scenario implemented to validate the architecture's capabilities.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents an approach to promote the integrity of perception systems for outdoor unmanned ground vehicles (UGV) operating in challenging environmental conditions (presence of dust or smoke). The proposed technique automatically evaluates the consistency of the data provided by two sensing modalities: a 2D laser range finder and a millimetre-wave radar, allowing for perceptual failure mitigation. Experimental results, obtained with a UGV operating in rural environments, and an error analysis validate the approach.
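A minimal sketch of the underlying idea, not the paper's actual algorithm: compare laser and radar ranges bearing by bearing and flag bins where the laser reports a return significantly closer than the radar, the typical signature of airborne dust or smoke that the radar penetrates but the laser does not. The tolerance and scan values are assumed.

```python
import numpy as np

def consistency_flags(laser_ranges, radar_ranges, tol=1.0):
    """Flag bearings where the 2D laser and the millimetre-wave radar disagree.

    laser_ranges, radar_ranges : range per bearing bin [m], aligned beforehand
    tol                        : allowed range disagreement [m] (assumed value)
    Returns a boolean mask; True marks suspect (possibly dust-affected) bins.
    """
    laser = np.asarray(laser_ranges, dtype=float)
    radar = np.asarray(radar_ranges, dtype=float)
    valid = np.isfinite(laser) & np.isfinite(radar)
    return valid & (radar - laser > tol)

# Toy scan: agreement everywhere except a dust cloud at bins 3-5 (values assumed).
laser = np.array([12.1, 11.8, 11.5, 4.2, 4.0, 4.1, 11.0, 10.8])
radar = np.array([12.0, 11.9, 11.6, 11.2, 11.0, 10.9, 11.1, 10.7])
print(consistency_flags(laser, radar))   # True at the dust-affected bins
```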

Relevance:

30.00%

Publisher:

Abstract:

Operating in vegetated environments is a major challenge for autonomous robots. Obstacle detection based only on geometric features causes the robot to treat foliage, for example small grass tussocks that could easily be driven through, as obstacles. Classifying vegetation does not solve this problem, since there might be an obstacle hidden behind the vegetation; in addition, dense vegetation typically needs to be considered an obstacle. This paper addresses the problem by augmenting a probabilistic traversability map, constructed from laser data, with ultra-wideband radar measurements. An adaptive detection threshold and a probabilistic sensor model are developed to convert the radar data to occupancy probabilities. The resulting map retains the fine resolution of the laser map but clears areas that were marked non-traversable only because of obstacle-free foliage. Experimental results show that this method improves the accuracy of traversability maps in vegetated environments.
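The sketch below illustrates the general fusion idea in log-odds form, with a simple logistic radar sensor model standing in for the paper's adaptive threshold and probabilistic model; all parameter values are assumed for illustration.

```python
import numpy as np

def logodds(p):
    return np.log(p / (1.0 - p))

def fuse_cell(p_laser_obstacle, radar_power, threshold, slope=4.0):
    """Fuse one traversability cell (illustrative, not the paper's exact model).

    p_laser_obstacle : obstacle probability from the laser-built map
    radar_power      : UWB radar return attributed to the cell
    threshold        : detection threshold for that cell (assumed adaptive upstream)
    slope            : steepness of the logistic radar sensor model (assumed)
    """
    # Radar sensor model: returns well below the threshold argue for free space
    # (obstacle-free foliage), returns above it argue for a real obstacle.
    p_radar_obstacle = 1.0 / (1.0 + np.exp(-slope * (radar_power - threshold)))
    fused = logodds(p_laser_obstacle) + logodds(p_radar_obstacle)
    return 1.0 / (1.0 + np.exp(-fused))

# A grass tussock: the laser says "obstacle", the radar return is weak -> cell cleared.
print(fuse_cell(p_laser_obstacle=0.8, radar_power=0.2, threshold=1.0))  # ~0.14
```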

Relevance:

30.00%

Publisher:

Abstract:

For a decade, embedded driving assistance systems were mainly dedicated to managing short-term events (lane departure, collision avoidance, collision mitigation). Recently, a great number of projects have focused on cooperative embedded devices in order to extend environment perception. Handling an extended perception range is important in order to provide enough information for both path planning and co-pilot algorithms, which need to anticipate events. Simulation has been widely used to develop such applications; it is an efficient way to estimate the benefits of Cooperative Systems (CS) based on Inter-Vehicular Communications (IVC). This paper presents a new, modular architecture built with the SiVIC simulator and the RTMaps™ multi-sensor prototyping platform. A set of improvements implemented in SiVIC is introduced in order to take IVC modelling and vehicle control into account. These two aspects have been tuned with on-road measurements to improve the realism of the scenarios. The results obtained from a freeway emergency braking scenario are discussed.
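To convey why an earlier warning matters in a freeway emergency braking (EEBL) scenario, here is a back-of-the-envelope stopping-distance comparison. The speed, deceleration and delay figures are assumed and are not results from the SiVIC/RTMaps experiments.

```python
# Minimal sketch: stopping distance with and without an IVC emergency-brake warning.
def stopping_distance(speed_ms, delay_s, decel_ms2):
    """Distance covered during the pre-braking delay plus the braking phase."""
    return speed_ms * delay_s + speed_ms ** 2 / (2.0 * decel_ms2)

speed = 36.0          # m/s (~130 km/h), assumed
decel = 6.0           # m/s^2, hard braking on dry asphalt, assumed
delay_without = 1.5   # s until braking begins, relying on brake lights ahead (assumed)
delay_with = 0.8      # s, IVC latency plus driver response to the in-vehicle alert (assumed)

without_cs = stopping_distance(speed, delay_without, decel)
with_cs = stopping_distance(speed, delay_with, decel)
print(f"stopping distance: {without_cs:.0f} m without EEBL, {with_cs:.0f} m with EEBL")
```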

Relevance:

30.00%

Publisher:

Abstract:

Mesoscale weather phenomena, such as the sea breeze circulation or lake-effect snow bands, are typically too large to be observed at one point, yet too small to be caught in a traditional network of weather stations. Hence, the weather radar is one of the best tools for observing, analysing and understanding their behaviour and development. A weather radar network is a complex system with many structural and technical features to be tuned, from the location of each radar to the number of pulses averaged in the signal processing. These design parameters have no universal optimal values; their selection depends on the nature of the weather phenomena to be monitored as well as on the applications for which the data will be used. The priorities and critical values are different for forest fire forecasting, aviation weather service or the planning of snow ploughing, to name a few radar-based applications. The main objective of the work performed within this thesis has been to combine knowledge of the technical properties of radar systems with an understanding of weather conditions, in order to produce better applications that can efficiently support weather- and safety-related decision making for modern society in northern conditions. When a new application is developed, it must be tested against ground truth. Two new verification approaches for radar-based hail estimates are introduced in this thesis. For mesoscale applications, finding a representative reference can be challenging, since these phenomena are by definition difficult to capture with surface observations. Hence, almost any valuable information that can be distilled from unconventional data sources, such as newspapers and holiday snapshots, is welcome. However, obtaining estimates of data quality, and judging to what extent two disparate information sources can be compared, is as important as obtaining the data itself. The new applications presented here do not rely on radar data alone, but ingest information from auxiliary sources such as temperature fields. The author concludes that radar will continue to be a key source of data and information in the future, especially when used effectively together with other meteorological data.
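For flavour, a common way to score radar-based detections against ground-truth reports is a contingency table and skill scores such as the critical success index. The sketch below uses invented counts and is not one of the thesis's two verification approaches.

```python
# Contingency-table skill scores for radar-based hail detections (counts assumed).
def contingency_scores(hits, misses, false_alarms):
    pod = hits / (hits + misses)                 # probability of detection
    far = false_alarms / (hits + false_alarms)   # false alarm ratio
    csi = hits / (hits + misses + false_alarms)  # critical success index
    return pod, far, csi

pod, far, csi = contingency_scores(hits=42, misses=11, false_alarms=19)
print(f"POD={pod:.2f}  FAR={far:.2f}  CSI={csi:.2f}")
```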

Relevance:

30.00%

Publisher:

Abstract:

The technical developments and advances made thus far are reviewed in the areas that will shape future phased array active aperture radar systems. The areas covered are printed circuit antennas and antenna arrays, GaAs MMIC design and fabrication leading to affordable transmitter-receiver (T-R) modules, and novel hardware and software developments. The use of fiber-optic distribution networks to interconnect the monolithically integrated optical components with the T-R modules is discussed. Beamforming and sidelobe control techniques for active phased array systems are also examined.
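As a small illustration of sidelobe control in an active array, the sketch below compares the computed array factor of a uniformly weighted and an amplitude-tapered linear array steered off broadside. The element count, spacing, steering angle and the choice of a Hamming taper are assumptions made for illustration, not parameters from the review.

```python
import numpy as np

N = 32              # elements (assumed)
d = 0.5             # element spacing in wavelengths (assumed)
steer_deg = 20.0    # steering angle (assumed)

n = np.arange(N)
steer_w = np.exp(-2j * np.pi * d * n * np.sin(np.radians(steer_deg)))  # phase-only weights
taper = np.hamming(N)                                                  # amplitude taper

angles = np.radians(np.linspace(-90.0, 90.0, 1441))
sv = np.exp(2j * np.pi * d * np.outer(np.sin(angles), n))              # steering vectors

def peak_sidelobe_db(pattern_db):
    """Largest lobe outside the main lobe; the main lobe is bounded by the first
    local minima on either side of the pattern peak."""
    k = int(np.argmax(pattern_db))
    lo = k
    while lo > 0 and pattern_db[lo - 1] <= pattern_db[lo]:
        lo -= 1
    hi = k
    while hi < len(pattern_db) - 1 and pattern_db[hi + 1] <= pattern_db[hi]:
        hi += 1
    outside = np.concatenate((pattern_db[:lo], pattern_db[hi + 1:]))
    return float(outside.max())

for name, w in (("uniform weights", steer_w), ("Hamming taper", steer_w * taper)):
    af = np.abs(sv @ w)
    af_db = 20.0 * np.log10(af / af.max() + 1e-12)
    print(f"{name}: peak sidelobe {peak_sidelobe_db(af_db):.1f} dB")
```

The taper lowers the sidelobes at the cost of a broader main beam and a lower aperture efficiency, which is the basic trade-off behind the sidelobe control techniques mentioned above.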