945 results for Radar simulators
Abstract:
This paper considers two problems that frequently arise in dynamic discrete choice problems but have not received much attention with regard to simulation methods. The first problem is how to construct unbiased simulators of probabilities conditional on past history. The second is how to simulate a discrete transition probability model when the underlying dependent variable is really continuous. Both methods work well relative to reasonable alternatives in the application discussed, although in both cases simpler methods also provide reasonably good results for this application.
Abstract:
The ability to build high-fidelity 3D representations of the environment from sensor data is critical for autonomous robots. Multi-sensor data fusion allows for more complete and accurate representations. Furthermore, using distinct sensing modalities (i.e. sensors using different physical processes and/or operating at different electromagnetic frequencies) usually leads to more reliable perception, especially in challenging environments, as modalities may complement each other. However, they may react differently to certain materials or environmental conditions, leading to catastrophic fusion. In this paper, we propose a new method to reliably fuse data from multiple sensing modalities, including in situations where they detect different targets. We first compute distinct continuous surface representations for each sensing modality, with uncertainty, using Gaussian Process Implicit Surfaces (GPIS). Second, we perform a local consistency test between these representations, to separate consistent data (i.e. data corresponding to the detection of the same target by the sensors) from inconsistent data. The consistent data can then be fused together, using another GPIS, and the rest of the data can be combined as appropriate. The approach is first validated using synthetic data. We then demonstrate its benefit using a mobile robot, equipped with a laser scanner and a radar, which operates in an outdoor environment in the presence of large clouds of airborne dust and smoke.
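A minimal sketch of the GPIS representation step this abstract describes, for a single sensing modality: surface hits are labelled 0, and points offset along the sensing direction are labelled +1 (free space) and -1 (interior), so the surface is the zero level set of the posterior mean and the posterior standard deviation gives the local uncertainty. The 2D toy data, the labelling scheme details, and the RBF kernel are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Toy GPIS for one modality: range returns scattered on a unit circle.
rng = np.random.default_rng(0)
angles = rng.uniform(0.0, 2.0 * np.pi, 40)
hits = np.c_[np.cos(angles), np.sin(angles)]   # surface points, label 0

# Off-surface support points along the ray from a central sensor.
X = np.vstack([hits, 1.2 * hits, 0.8 * hits])  # on / outside / inside
y = np.r_[np.zeros(40), np.ones(40), -np.ones(40)]

gpis = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), alpha=1e-3)
gpis.fit(X, y)

# Query: mean near 0 indicates the surface; std is the local uncertainty.
mean, std = gpis.predict(np.array([[1.0, 0.0], [0.5, 0.0]]), return_std=True)
print(mean, std)
```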
Abstract:
Anuradha Mathur and Dilip da Cunha theorise in their work on cities and flooding that it is not the floodwaters that threaten lives and homes; the real cause of danger in natural disaster is the fixity of modern civilisation. Their work traces the fluidity of the boundaries between 'dry' and 'wet' land, challenging the deficiencies of traditional cartography in representing the extents of bodies of water. Mathur and da Cunha propose a process of unthinking to address the redevelopment of communities in the aftermath of natural disaster. By documenting the path of floodwaters in non-Euclidean space they propose a more appropriate response to flooding. This research focuses on the documentation of flooding in the interior of dwellings, an extreme condition of damage by external forces in an environment designed to protect from these very elements. Because the floodwaters don't discriminate between the interior and the exterior, they move between structures with disregard for the systems of space we have in place. With the rapid clean-up that follows flood damage, little material evidence is left for post-mortem examination. This is especially the case for the flood-damaged interior: piles of materials susceptible to the elements, furniture, joinery and personal objects line curbsides awaiting disposal. There is a missed opportunity in examining the interior in the aftermath of flood; in the way that Mathur and da Cunha investigate floods and the design of cities, the flooded interior proffers an undesigned interior to study. In the absence of intact flood-damaged interiors, this research relies on two artists' documentation of the flooded interior. The first case study is the mimetic scenographic interiors of a flood-damaged office exhibited in a Bangkok art gallery by the group _Proxy in 2011. The second case study is Robert Polidori's photographic exhibition in New Orleans, described by Julianna Preston as 'a series of interiors undetected by satellite imaging or storm radar. More telling, more dramatic, more unnerving, more alarming, they force a disturbance of what is familiar'.
Abstract:
Introduction A novel realistic 3D virtual reality (VR) application has been developed to allow medical imaging students at Queensland University of Technology to practice radiographic techniques independently outside the usual radiography laboratory. Methods A flexible agile development methodology was used to create the software rapidly and effectively. A 3D gaming environment and realistic models were used to engender presence in the software while tutor-determined gold standards enabled students to compare their performance and learn in a problem-based learning pedagogy. Results Students reported high levels of satisfaction and perceived value and the software enabled up to 40 concurrent users to prepare for clinical practice. Student feedback also indicated that they found 3D to be of limited value in the desktop version compared to the usual 2D approach. A randomised comparison between groups receiving software-based and traditional practice measured performance in a formative role play with real equipment. The results of this work indicated superior performance with the equipment for the VR trained students (P = 0.0366) and confirmed the value of VR for enhancing 3D equipment-based problem-solving skills. Conclusions Students practising projection techniques virtually performed better at role play assessments than students practising in a traditional radiography laboratory only. The application particularly helped with 3D equipment configuration, suggesting that teaching 3D problem solving is an ideal use of such medical equipment simulators. Ongoing development work aims to establish the role of VR software in preparing students for clinical practice with a range of medical imaging equipment.
Empirical vehicle-to-vehicle pathloss modeling in highway, suburban and urban environments at 5.8 GHz
Abstract:
In this paper, we present a pathloss characterization for vehicle-to-vehicle (V2V) communications based on empirical data collected from an extensive measurement campaign performed under line-of-sight (LOS) and non-line-of-sight (NLOS) conditions and varying traffic densities. The experiment was conducted at 5.8 GHz in three different V2V propagation environments: highway, suburban and urban. We developed pathloss models for each of the three V2V environments considered. Based on a log-distance power law model, we report values for the pathloss exponent and the standard deviation of shadowing. The average pathloss exponent ranges from 1.77 for the highway and 1.68 for the urban to 1.53 for the suburban environment. The reported results can contribute to vehicular ad hoc network (VANET) simulators and can be used by system designers to develop, evaluate and validate new protocols and system designs under realistic propagation conditions.
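As a quick illustration of the log-distance power-law model this abstract refers to, the sketch below evaluates PL(d) = PL(d0) + 10 n log10(d/d0) + X_sigma with the environment-specific exponents quoted above. The reference distance d0, the reference pathloss PL(d0) (roughly free space at 1 m for 5.8 GHz), and the shadowing standard deviation are assumptions for illustration, not values from the paper.

```python
import numpy as np

def pathloss_db(d, n, pl_d0=47.9, d0=1.0, sigma=3.0, rng=None):
    """Log-distance pathloss (dB) at distance d (m) with exponent n.

    pl_d0, d0 and sigma are illustrative assumptions, not paper values.
    """
    rng = rng or np.random.default_rng()
    shadowing = rng.normal(0.0, sigma, size=np.shape(d))  # X_sigma, dB
    return pl_d0 + 10.0 * n * np.log10(np.asarray(d) / d0) + shadowing

# Exponents reported in the abstract for the three V2V environments.
exponents = {"highway": 1.77, "urban": 1.68, "suburban": 1.53}
distances = np.array([10.0, 50.0, 100.0, 500.0])  # metres
for env, n in exponents.items():
    print(env, np.round(pathloss_db(distances, n), 1))
```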
Abstract:
Summary form only given. Geometric simplicity, efficiency and polarization purity make slot antenna arrays ideal solutions for many radar, communications and navigation applications, especially when high power, light weight and limited scan volume are priorities. Resonant arrays of longitudinal slots have a slot spacing of one-half guide wavelength at the design frequency, so that the slots are located at the standing-wave peaks. Planar arrays are implemented using a number of rectangular waveguides (branch line guides) arranged side by side, while main-line waveguides located behind and at right angles to the branch lines excite the radiating waveguides via centered-inclined coupling slots. Planar slotted waveguide arrays radiate broadside beams, and all radiators are designed to be in phase.
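To make the half-guide-wavelength spacing concrete, the sketch below computes the TE10 guide wavelength, lambda_g = lambda_0 / sqrt(1 - (lambda_0 / 2a)^2), and the resulting slot spacing for a rectangular waveguide. The WR-90 broad-wall width and the 10 GHz design frequency are assumed for illustration; the abstract does not specify a band.

```python
import math

c = 299_792_458.0   # speed of light, m/s
a = 22.86e-3        # WR-90 broad-wall width, m (assumed)
f = 10e9            # design frequency, Hz (assumed)

lam0 = c / f                                        # free-space wavelength
lam_c = 2.0 * a                                     # TE10 cutoff wavelength
lam_g = lam0 / math.sqrt(1.0 - (lam0 / lam_c) ** 2) # guide wavelength

print(f"guide wavelength = {lam_g * 1e3:.2f} mm")
print(f"slot spacing (lambda_g / 2) = {lam_g / 2 * 1e3:.2f} mm")
```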
Abstract:
Hydraulic conductivity (K) fields are used to parameterize groundwater flow and transport models. Numerical simulations require a detailed representation of the K field, synthesized to interpolate between available data. Several recent studies introduced high-resolution K data (HRK) at the Macro Dispersion Experiment (MADE) site, and used ground-penetrating radar (GPR) to delineate the main structural features of the aquifer. This paper describes a statistical analysis of these data, and the implications for K field modeling in alluvial aquifers. Two striking observations have emerged from this analysis. The first is that a simple fractional difference filter can have a profound effect on data histograms, organizing non-Gaussian ln K data into a coherent distribution. The second is that using GPR facies allows us to reproduce the significantly non-Gaussian shape seen in real HRK data profiles, using a simulated Gaussian ln K field in each facies. This illuminates a current controversy in the literature, between those who favor Gaussian ln K models, and those who observe non-Gaussian ln K fields. Both camps are correct, but at different scales.
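A minimal sketch of the kind of fractional difference filter mentioned above, applied to a ln K profile: the filter (1 - B)^d is expanded in binomial weights and convolved with the series. The filter order d = 0.4 and the synthetic random-walk profile are illustrative assumptions, not values or data from the study.

```python
import numpy as np

def frac_diff_weights(d, n):
    """First n binomial weights of the fractional difference (1 - B)^d."""
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = -w[k - 1] * (d - k + 1) / k
    return w

def frac_diff(x, d):
    """Apply the fractional difference filter to series x (truncated)."""
    w = frac_diff_weights(d, len(x))
    return np.array([np.dot(w[:k + 1], x[k::-1]) for k in range(len(x))])

# Synthetic ln K profile (random walk) standing in for HRK data.
ln_k = np.cumsum(np.random.default_rng(0).standard_normal(200)) * 0.1
increments = frac_diff(ln_k, d=0.4)  # d = 0.4 is an assumed order
print(increments[:5])
```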
Abstract:
This paper presents an overview of the strengths and limitations of existing and emerging geophysical tools for landform studies. The objectives are to discuss recent technical developments and to provide a review of relevant recent literature, with a focus on propagating field methods with terrestrial applications. For the various methods in this category, including ground-penetrating radar (GPR), electrical resistivity (ER), seismics, and electromagnetic (EM) induction, the technical backgrounds are introduced, followed by a section on novel developments relevant to landform characterization. For several decades, GPR has been popular for characterization of the shallow subsurface and in particular sedimentary systems. Novel developments in GPR include the use of multi-offset systems to improve signal-to-noise ratios and data collection efficiency, amongst others, and the increased use of 3D data. Multi-electrode ER systems have become popular in recent years as they allow for relatively fast and detailed mapping. Novel developments include time-lapse monitoring of dynamic processes as well as the use of capacitively-coupled systems for fast, non-invasive surveys. EM induction methods are especially popular for fast mapping of spatial variation, but can also be used to obtain information on the vertical variation in subsurface electrical conductivity. In recent years several examples of the use of plane-wave EM for characterization of landforms have been published. Seismic methods for landform characterization include seismic reflection and refraction techniques and the use of surface waves. A recent development is the use of passive sensing approaches. The use of multiple geophysical methods, which can benefit from the sensitivity to different subsurface parameters, is becoming more common. Strategies for coupled and joint inversion of complementary datasets will, once more widely available, benefit the geophysical study of landforms. Three case studies are presented on the use of electrical and GPR methods for characterization of landforms in the range of meters to 100s of meters in dimension. In a study of polygonal patterned ground in the Saginaw Lowlands, Michigan, USA, electrical resistivity tomography was used to characterize differences in subsurface texture and water content associated with polygon-swale topography. Also, a sand-filled thermokarst feature was identified using electrical resistivity data. The second example is on the use of constant spread traversing (CST) for characterization of large-scale glaciotectonic deformation at Ludington Ridge, Michigan. Multiple CST surveys parallel to an ~60 m high cliff, where broad (~100 m) synclines and narrow clay-rich anticlines are visible, illustrated that at least one of the narrow structures extended inland. A third case study discusses internal structures of an eolian dune on a coastal spit in New Zealand. Both 35 and 200 MHz GPR data, which clearly identified a paleosol and internal sedimentary structures of the dune, were used to improve understanding of the development of the dune, which may shed light on paleo-wind directions.
Abstract:
Full-resolution 3D Ground-Penetrating Radar (GPR) data were combined with high-resolution hydraulic conductivity (K) data from vertical Direct-Push (DP) profiles to characterize a portion of the highly heterogeneous Macro Dispersion Experiment (MADE) site. This is an important first step to better understand the influence of aquifer heterogeneities on observed anomalous transport. Statistical evaluation of DP data indicates non-normal distributions that have much higher similarity within each GPR facies than between facies. The analysis of GPR and DP data provides high-resolution estimates of the 3D geometry of hydrostratigraphic zones, which can then be populated with stochastic K fields. The lack of such estimates has been a significant limitation for testing and parameterizing a range of novel transport theories at sites where the traditional advection-dispersion model has proven inadequate.
Abstract:
Three thousand liters of water were infiltrated from a 4 m diameter pond to track flow and transport inside fractured carbonates with 20-40% porosity. Sixteen time-lapse 3D Ground-Penetrating Radar (GPR) surveys with repetition intervals between 2 hrs and 5 days monitored the spreading of the water bulb in the subsurface. Based on local travel-time shifts between repeated GPR survey pairs, localized changes of volumetric water content can be related to the processes of wetting, saturation and drainage. Deformation bands consisting of thin subvertical sheets of crushed grains reduce the magnitude of water content changes but enhance flow in the sheet-parallel direction. This causes an earlier breakthrough across a stratigraphic boundary compared to porous limestone without deformation bands. This experiment shows how time-lapse 3D GPR, or 4D GPR, can non-invasively track ongoing flow processes in rock volumes of over 100 m³.
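A minimal sketch of how a local travel-time shift can be turned into a change in volumetric water content, in the spirit of the time-lapse analysis above: interval velocity from two-way time, permittivity from velocity, then water content from a petrophysical relation. The layer thickness, the travel times, and the use of Topp et al.'s (1980) empirical equation are illustrative assumptions; the paper's actual processing chain is not reproduced here.

```python
C = 0.2998  # speed of light, m/ns

def topp_theta(eps):
    """Topp et al. (1980): volumetric water content from permittivity."""
    return -5.3e-2 + 2.92e-2 * eps - 5.5e-4 * eps**2 + 4.3e-6 * eps**3

def theta_from_twt(t_ns, h_m):
    """Water content of a layer of thickness h_m from two-way time t_ns."""
    v = 2.0 * h_m / t_ns   # interval velocity, m/ns
    eps = (C / v) ** 2     # relative dielectric permittivity
    return topp_theta(eps)

# Travel-time increase across a 1 m interval between two surveys (assumed).
theta_before = theta_from_twt(t_ns=12.0, h_m=1.0)
theta_after = theta_from_twt(t_ns=14.0, h_m=1.0)
print(f"d_theta = {theta_after - theta_before:+.3f}")
```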
Abstract:
This paper is not about the details of yet another robot control system, but rather the issues surrounding real-world robotic implementation. It is a fact that in order to realise a future where robots co-exist with people in everyday places, we have to pass through a developmental phase that involves some risk. Putting a “Keep Out, Experiment in Progress” sign on the door is no longer possible, since we are now at a level of capability that requires testing over long periods of time in complex, realistic environments that contain people. We all know that controlling the risk is important – a serious accident could set the field back globally – but just as important is convincing others that the risks are known and controlled. In this article, we describe our experience going down this path and we show that health and safety assessment for mobile robotics research is still unexplored territory in universities and is often ignored. We hope that the article will make robotics research labs in universities around the world take note of these issues, rather than operating under the radar, and thereby prevent any catastrophic accidents.
Abstract:
This paper presents a visual SLAM method for temporary satellite dropout navigation, here applied on fixed-wing aircraft. It is designed for flight altitudes beyond typical stereo ranges, but within the range of distance measurement sensors. The proposed visual SLAM method consists of a common localization step with monocular camera resectioning, and a mapping step which incorporates radar altimeter data for absolute scale estimation. As a result, there is no scale drift of the map or the estimated flight path. The method does not require simplifications such as known landmarks and is thus suitable for unknown and nearly arbitrary terrain. The method is tested with sensor datasets from a manned Cessna 172 aircraft. With a 5% absolute scale error from radar measurements causing approximately 2-6% accumulated error over the flown distance, stable positioning is achieved over several minutes of flight time. The main limitations are flight altitudes above the radar range of 750 m, where the monocular method will suffer from scale drift, and, depending on the flight speed, flights below 50 m, where image processing becomes difficult with a downwards-looking camera due to the high optical flow rates and the low image overlap.
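A minimal sketch of the absolute-scale idea described above: a monocular reconstruction is only known up to scale, and a radar altimeter reading over the mapped terrain pins that scale down. The variable names and numbers are illustrative assumptions, not code or values from the paper.

```python
import numpy as np

def absolute_scale(radar_altitude_m, monocular_altitude):
    """Scale factor mapping monocular map units to metres."""
    return radar_altitude_m / monocular_altitude

# Up-to-scale camera positions from monocular resectioning (assumed data;
# the third column is height above the mapped terrain in map units).
positions = np.array([[0.0, 0.0, 1.00],
                      [0.4, 0.1, 1.02],
                      [0.8, 0.2, 1.05]])

s = absolute_scale(radar_altitude_m=312.0, monocular_altitude=positions[-1, 2])
positions_metric = s * positions  # metric flight path, no scale drift
# A 5% error in s propagates proportionally into the metric path, which is
# consistent with the 2-6% accumulation error the abstract reports.
print(positions_metric)
```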
Abstract:
Fusing data from multiple sensing modalities, e.g. laser and radar, is a promising approach to achieve resilient perception in challenging environmental conditions. However, this may lead to catastrophic fusion in the presence of inconsistent data, i.e. when the sensors do not detect the same target due to distinct attenuation properties. It is often difficult to discriminate consistent from inconsistent data across sensing modalities using local spatial information alone. In this paper we present a novel consistency test based on the log marginal likelihood of a Gaussian process model that evaluates data from range sensors in a relative manner. A new data point is deemed to be consistent if the model statistically improves as a result of its fusion. This approach avoids the need for the absolute spatial distance threshold parameters required by previous work. We report results from object reconstruction with both synthetic and experimental data that demonstrate an improvement in reconstruction quality, particularly in cases where data points are inconsistent yet spatially proximal.
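A minimal sketch in the spirit of the relative test described above: a candidate point is accepted if the Gaussian process model's per-point log marginal likelihood does not degrade when the point is added. The kernel, the acceptance rule with its slack parameter, and the toy 1D data are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Fixed hyperparameters (optimizer=None) keep the LML comparison fair.
kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=1e-2)

def per_point_lml(X, y):
    gp = GaussianProcessRegressor(kernel=kernel, optimizer=None).fit(X, y)
    return gp.log_marginal_likelihood_value_ / len(y)

def is_consistent(X, y, x_new, y_new, tol=0.1):
    """Accept x_new if the model statistically improves after fusion.

    tol is an assumed slack allowing small per-point LML fluctuations.
    """
    base = per_point_lml(X, y)
    fused = per_point_lml(np.vstack([X, x_new]), np.append(y, y_new))
    return fused >= base - tol

X = np.linspace(0.0, 5.0, 20).reshape(-1, 1)  # e.g. laser range data
y = np.sin(X).ravel() + 0.05 * np.random.default_rng(1).standard_normal(20)
print(is_consistent(X, y, np.array([[2.5]]), y_new=np.sin(2.5)))  # near surface
print(is_consistent(X, y, np.array([[2.5]]), y_new=3.0))          # far off it
```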
Abstract:
In an essay, "The Books of Last Things", Delia Falconer discusses the emergence of a new genre in publishing - microhistories. She cites a number of recent titles in non-fiction and fiction - Longitude, Cod, Tulips, Pushkin's Button, Nathaniel's Nutmeg, Zarafa, The Surgeon of Crowthorne, The Potato, The Perfect Storm. Delia Falconer observes of this tradition: "One has the sense, reading these books, of a surprising weight, of pleasant shock. In part, it is because we are looking at things which are generally present around us, but modestly out of sight and mind - historical nitty gritty like cod, potatoes, longitudinal clocks - which the authors have thrust suddenly, like a Biblical visitation of frogs or locusts, in our face. Things like spice and buttons and clocks are generally seen to enable history on the large scale, but are not often viewed as its worthy subjects. And by the same grand logic of history, more unusual phenomena like cabinets of curiosities or glass-making or farm lore or sailors' knots are simply odd blips on its radar screen, interesting footnotes. These new books, microhistories, reverse the usual order of history, which argues from the general to the particular, in order to prove its inevitable progress. They start from the footnotes. But by reversing the process, and walking through the back door of history, you don't necessarily end up at the front of the same house." Delia Falconer speculates about the reasons for the popularity of microhistories. She concludes: "I would like to think that reading them is not simply an exercise in nostalgia, but a challenge to the present". In Mauve, Simon Garfield provides a new way of thinking and writing about the history of intellectual property. Instead of providing a grand historical narrative of intellectual property, he tells the story of a particular invention, and its exploitation. Simon Garfield relates how English chemist William Perkin accidentally discovered a way to mass-produce the colour mauve in a factory. Working on a treatment for malaria in his London home laboratory, Perkin failed to produce artificial quinine. Instead he created a dark oily sludge that turned silk a beautiful light purple. The colour was unique and became the most desirable shade in the fashion houses of Paris and London. ... The book Mauve will have a number of contemporary resonances for intellectual property lawyers and academics. Simon Garfield emphasizes the difficulties inherent in commercialising an invention and managing intellectual property. He investigates the uneasy collaboration between industry and science. Simon Garfield suggests that complaints about the efficacy of patent offices are perennial. He also highlights the problems faced by courts and law-makers in accommodating new technologies within the logic of patent law. In his elegant microhistory of the colour mauve, Simon Garfield confirms the conclusion of Brad Sherman and Lionel Bently that many aspects of modern intellectual property law can only be understood through an understanding of the past: "The image of intellectual property law that developed during the 19th century and the narrative of identity which this engendered played and continue to play an important role in the way we think about and understand intellectual property law".