945 results for "Radar simulators"
Abstract:
The papers in this special issue focus on the topic of location awareness for radios and networks. Location awareness using radio signals stands to revolutionize the fields of navigation and communication engineering. It can be utilized to great effect in the next generation of cellular networks, mining applications, health-care monitoring, transportation and intelligent highways, multi-robot applications, first responder operations, military applications, factory automation, building and environmental controls, cognitive wireless networks, commercial and social network applications, and smart spaces. A multitude of technologies can be used in location-aware radios and networks, including GNSS, RFID, cellular, UWB, WLAN, Bluetooth, cooperative localization, indoor GPS, device-free localization, IR, radar, and UHF. The performance of these technologies is measured by accuracy, precision, complexity, robustness, scalability, and cost. Given the many application scenarios across different disciplines, there is a clear need for a broad, up-to-date and cogent treatment of radio-based location awareness. This special issue aims to provide a comprehensive overview of the state of the art in technology, regulation, and theory. It also presents a holistic view of research challenges and opportunities in the emerging areas of localization.
Abstract:
Inkjet printing is proposed as a means to create the resistively loaded elements of a frequency selective surface (FSS) which suppresses radar backscatter when placed above a metal ground plane. Spectral transmission and reflection measurements from 9 to 18 GHz show that the dot density of the printed features and the volume ratio of an aqueous vehicle and nano-silver (Ag) ink mixture can be selected to obtain surface resistances in the range 1.2-200 Ω/sq.
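The abstract does not give the design equations, but the behaviour of a resistive sheet above a metal ground plane can be illustrated with the classic Salisbury-screen transmission-line model. The sketch below is a simplified illustration under that assumption (normal incidence, air spacer); the quarter-wave spacer and the 120 Ω/sq value are hypothetical and not taken from the paper.

```python
import numpy as np

ETA0 = 376.73  # free-space wave impedance, ohms

def reflection_coefficient(freq_hz, r_sheet, spacer_m):
    """Reflection from a resistive sheet (r_sheet, ohms/sq) placed spacer_m
    above a metal ground plane (simple transmission-line model, normal incidence)."""
    beta = 2 * np.pi * freq_hz / 3e8                   # free-space phase constant
    z_short = 1j * ETA0 * np.tan(beta * spacer_m)      # ground plane seen through the spacer
    z_in = (r_sheet * z_short) / (r_sheet + z_short)   # sheet in parallel with the spacer line
    return (z_in - ETA0) / (z_in + ETA0)

# Example: hypothetical 120 ohm/sq sheet a quarter-wave above the ground plane at 13.5 GHz
spacer = 3e8 / 13.5e9 / 4
for f in np.linspace(9e9, 18e9, 10):
    gamma = reflection_coefficient(f, 120.0, spacer)
    print(f"{f/1e9:5.1f} GHz  |Gamma| = {abs(gamma):.2f}")
```

The reflection null deepens as the sheet resistance approaches the free-space impedance of about 377 Ω/sq, which is why controlling the printed surface resistance matters for backscatter suppression.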
Abstract:
Rapid in situ diagnosis of damage is a key issue in the preservation of stone-built cultural heritage. This is evident in the increasing number of congresses, workshops and publications dealing with this issue. With this increased activity has come, however, the realisation that for many culturally significant artefacts it is not possible either to remove samples for analysis or to affix surface markers for measurement. It is for this reason that there has been a growth of interest in non-destructive and minimally invasive techniques for characterising internal and external stone condition. With this interest has come the realisation that no single technique can adequately encompass the wide variety of parameters to be assessed or provide the range of information required to identify appropriate conservation. In this paper we describe a strategy to address these problems through the development of an integrated 'tool kit' of measurement and analytical techniques aimed specifically at linking object-specific research to appropriate intervention. The strategy is based initially upon the acquisition of accurate three-dimensional models of stone-built heritage at different scales using a combination of millimetre-accurate LiDAR and sub-millimetre-accurate Object Scanning that can be exported into a GIS or directly into CAD. These are currently used to overlay information on stone characteristics obtained through a combination of Ground Penetrating Radar, Surface Permeametry, Colorimetry and X-ray Fluorescence, but the possibility exists for adding to this array of techniques as appropriate. In addition to the integrated three-dimensional data array provided by superimposition upon Digital Terrain Models, there is the capability of accurate re-measurement to show patterns of surface loss and changes in material condition over time. It is thus possible both to record and baseline condition and to identify areas that require either preventive maintenance or more significant pre-emptive intervention. In pursuit of these goals the authors are developing, through a UK Government-supported collaboration between University Researchers and Conservation Architects, commercially viable protocols for damage diagnosis, condition monitoring and, eventually, mechanisms for prioritizing repairs to stone-built heritage. The understanding is, however, that such strategies are not age-constrained and can ultimately be applied to structures of any age.
Abstract:
DRAM technology faces density and power challenges to increase capacity because of limitations of physical cell design. To overcome these limitations, system designers are exploring alternative solutions that combine DRAM and emerging NVRAM technologies. Previous work on heterogeneous memories focuses mainly on two system designs: PCache, a hierarchical, inclusive memory system, and HRank, a flat, non-inclusive memory system. We demonstrate that neither of these designs can universally achieve high performance and energy efficiency across a suite of HPC workloads. In this work, we investigate the impact of a number of multilevel memory designs on the performance, power, and energy consumption of applications. To achieve this goal and overcome the limited number of tools available for studying heterogeneous memories, we created HMsim, an infrastructure that enables n-level, heterogeneous memory studies by leveraging existing memory simulators. We then propose HpMC, a new memory controller design that combines the best aspects of existing management policies to improve performance and energy. Our energy-aware memory management system dynamically switches between PCache and HRank based on the temporal locality of applications. Our results show that HpMC reduces energy consumption by 13% to 45% compared to PCache and HRank, while providing the same bandwidth and higher capacity than a conventional DRAM system.
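The abstract says HpMC switches between the PCache and HRank policies according to the applications' temporal locality, but does not describe the decision logic. A minimal, hypothetical sketch of such an epoch-based switch is shown below; the locality metric, thresholds and hysteresis band are assumptions, not HpMC's actual parameters.

```python
from enum import Enum

class Policy(Enum):
    PCACHE = "hierarchical, inclusive (DRAM acts as a cache for NVRAM)"
    HRANK = "flat, non-inclusive (DRAM and NVRAM side by side)"

# Hypothetical thresholds; the real HpMC parameters are not given in the abstract.
HIGH_LOCALITY = 0.7
LOW_LOCALITY = 0.3

def choose_policy(dram_hit_rate: float, current: Policy) -> Policy:
    """Epoch-based policy selection: high temporal locality favours caching
    hot data in DRAM (PCache); low locality favours a flat address space (HRank)."""
    if dram_hit_rate >= HIGH_LOCALITY:
        return Policy.PCACHE
    if dram_hit_rate <= LOW_LOCALITY:
        return Policy.HRANK
    return current  # hysteresis band: keep the current policy to avoid thrashing

# Example: per-epoch DRAM hit rates observed by the memory controller
policy = Policy.PCACHE
for epoch, hit_rate in enumerate([0.9, 0.8, 0.5, 0.2, 0.1, 0.6]):
    policy = choose_policy(hit_rate, policy)
    print(f"epoch {epoch}: hit rate {hit_rate:.1f} -> {policy.name}")
```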
Abstract:
This paper presents a critical analysis of ultrawideband (UWB) and considers the turbulent journey it has had from the Federal Communications Commission's bandwidth allocation in 2002 to today. It analyzes the standards, the standoffs, and the stalemate in standardization activities and investigates the past and present research and commercial activities in realizing the UWB dream. In this paper, statistical evidence is presented to depict UWB's changing fortunes and is utilized as an indicator of future prominence. This paper reviews some of the opinions and remarks from commentators and analyzes predictions that were made. Finally, it presents possible ways forward to reignite the high-data-rate UWB standardization pursuit.
Abstract:
A 94 GHz waveguide Rotman lens is described which can be used to implement an amplitude comparison monopulse RADAR. In transmit mode, adjacent dual beam ports are excited with equal amplitude and phase to form a sum radiation pattern, and in receive mode, the outputs of the beam port pairs are combined using magic tees to provide a sum and a difference signal which can be used to calculate an angular error estimate for target acquisition and tracking. This approach provides an amplitude comparison monopulse system which can be scanned in azimuth and which has a low component count, with no requirement for phase shift circuitry in the array feed lines, making it suitable for mm-wave frequencies. A 12 input (beam ports), 12 output (array ports) lens is designed using CST Microwave Studio, and the predicted results are presented.
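For context, amplitude-comparison monopulse derives the angular error from the ratio of the difference and sum signals formed by the magic tees. A minimal sketch of that post-processing step follows; the slope constant and sample values are hypothetical, and the lens and receiver chain are not modelled.

```python
import numpy as np

def monopulse_error(sum_ch: complex, diff_ch: complex, slope_deg: float) -> float:
    """Angular error estimate (degrees) from complex sum/difference channel samples.
    slope_deg is the monopulse slope constant, normally obtained by calibration."""
    ratio = np.real(diff_ch * np.conj(sum_ch)) / (abs(sum_ch) ** 2)
    return slope_deg * ratio

# Example with made-up channel samples and a hypothetical slope of 2 deg per unit ratio
sum_sample = 1.0 + 0.1j
diff_sample = 0.15 + 0.02j
print(f"off-boresight estimate: {monopulse_error(sum_sample, diff_sample, 2.0):.2f} deg")
```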
Abstract:
One of the challenges the tidal power industry faces is the requirement for cost-effective, reliable yet highly accurate acquisition of flow data. Different methods are required, as applications range over different spatial and temporal scales. The first sections of this report assemble theoretical background information on acoustic Doppler velocimetry and radar measurements. The use of existing expertise in field tests of marine vehicles is discussed next, followed by a discussion of issues relating to recreating field conditions in laboratory environments. The last three sections present practical applications of various methods performed in field conditions. While progress has been made over recent years, this overview highlights the challenges in full-scale field measurements and knowledge gaps in the industry.
Abstract:
X-ray backscatter imaging can be used for a wide range of imaging applications, in particular for industrial inspection and portal security. Currently, the application of this imaging technique to the detection of landmines is limited because the surrounding sand or soil strongly attenuates the 10s to 100s of keV X-rays required for backscatter imaging. Here, we introduce a new approach involving a 140 MeV short-pulse (< 100 fs) electron beam generated by laser wakefield acceleration to probe the sample, which produces Bremsstrahlung X-rays within the sample, enabling greater depths to be imaged. A variety of detector and scintillator configurations are examined, with the best time response seen from an absorptively coated BaF2 scintillator with a bandpass filter to remove the slow scintillation emission components. An X-ray backscatter image of an array of items of different density and atomic number is demonstrated. The use of a compact laser wakefield accelerator to generate the electron source, combined with the rapid development of more compact, efficient and higher repetition rate high-power laser systems, will make this system feasible for applications in the field.
Abstract:
Prostate cancer (CaP) is the most commonly diagnosed cancer in males. There have been dramatic technical advances in radiotherapy delivery, enabling higher doses of radiotherapy to primary cancer, involved lymph nodes and oligometastases with acceptable normal tissue toxicity. Despite this, many patients relapse following primary radical therapy, and novel treatment approaches are required. Metal nanoparticles are agents that promise to improve diagnostic imaging and image-guided radiotherapy and to selectively enhance radiotherapy effectiveness in CaP. We summarize current radiotherapy treatment approaches for CaP and consider pre-clinical and clinical evidence for metal nanoparticles in this condition.
Prostate cancer (CaP) is the most commonly diagnosed cancer in males and is responsible for more than 10,000 deaths each year in the UK.1 Technical advances in radiotherapy delivery, including image-guided intensity-modulated radiotherapy (IG-IMRT), have enabled the delivery of higher radiation dose to the prostate, which has led to improved biochemical control. Further improvements in cancer imaging during radiotherapy are being developed with the advent of MRI simulators and MRI linear accelerators.2–4
Nanotechnology promises to deliver significant advancements across numerous disciplines.5 The widest scope of applications is in the biomedical field, including exogenous gene/drug delivery systems, advanced biosensors, targeted contrast agents for diagnostic applications and direct therapeutic agents used in combination with existing treatment modalities.6–11 This diversity of application is especially evident within cancer research, with a myriad of experimental anticancer strategies currently under investigation.
This review will focus specifically on the potential of metal-based nanoparticles to augment the efficacy of radiotherapy in CaP, a disease where radiotherapy constitutes a major curative treatment modality.12 Furthermore, we will also address the clinical state of the art for CaP radiotherapy and consider how these treatments could be best combined with nanotherapeutics to improve cancer outcomes.
Abstract:
Key content
- Trainees face many challenges in learning the skill set required to perform laparoscopic surgery.
- Time spent in the operating room has been detrimentally affected since the implementation of the European Working Time Directive. To address this deficit, surgical educators have looked to the benefits enjoyed by the aviation and sports industries in using simulation training.
Learning objectives
- To summarise the current understanding of the neuropsychological basis of learning a psychomotor skill.
- To clarify factors that influence the acquisition of these skills.
- To summarise how this information can be used in teaching and assessment of laparoscopic skills.
Ethical issues
- The use of virtual reality simulators may be able to form part of the aptitude assessment in the selection process, in order to identify trainees with the desired attributes to progress into training programmes. However, as skill improves with practice, is it ethical to exclude novices with poor initial assessment performance before allowing them the opportunity to improve?
Abstract:
Field programmable gate array (FPGA) technology is a powerful platform for implementing computationally complex digital signal processing (DSP) systems. Applications that are multi-modal, however, are designed for worst-case conditions. In this paper, genetic sequencing techniques are applied to give a more sophisticated decomposition of the algorithmic variations, thus allowing a unified hardware architecture which gives a 10-25% area saving and a 15% power saving for a digital radar receiver.
Abstract:
This study applies spatial statistical techniques, including cokriging, to integrate airborne geophysical (radiometric) data with ground-based measurements of peat depth and soil organic carbon (SOC) to monitor change in peat cover for carbon stock calculations. The research is part of the EU-funded Tellus Border project and is supported by the INTERREG IVA development programme of the European Regional Development Fund, which is managed by the Special EU Programmes Body (SEUPB). The premise is that saturated peat attenuates the radiometric signal from underlying soils and rocks. Contemporaneous ground-based measurements were collected to corroborate mapped estimates and develop a statistical model for volumetric carbon content (VCC) to 0.5 metres. Field measurements included ground penetrating radar, gamma ray spectrometry and a soil sampling methodology which measured bulk density and soil moisture to determine VCC. One aim of the study was to explore whether airborne radiometric survey data can be used to establish VCC across a region. To account for the footprint of airborne radiometric data, five cores were obtained at each soil sampling location: one at the centre of the ground radiometric equivalent sample location and one at each of the four corners 20 metres apart. This soil sampling strategy replicated the methodology deployed for the Tellus Border geochemistry survey. Two key issues arising from this work will be discussed. The first addresses the integration of different sampling supports for airborne and ground-measured data and the second discusses the compositional nature of the VCC data.
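The abstract states that bulk density and soil moisture were measured to determine volumetric carbon content to 0.5 m, without giving the calculation. Purely as an illustration, a conventional single-layer carbon-stock estimate from dry bulk density, organic carbon fraction and depth might look like the sketch below; the sample values are hypothetical and are not data from the study.

```python
def carbon_stock_kg_per_m2(dry_bulk_density_g_cm3: float,
                           organic_carbon_fraction: float,
                           depth_m: float) -> float:
    """Conventional soil carbon stock estimate for a single layer.
    dry_bulk_density_g_cm3 : oven-dry bulk density (g/cm^3)
    organic_carbon_fraction: mass fraction of organic carbon (0-1)
    depth_m                : layer thickness (m)
    Returns kg of carbon per square metre."""
    # g/cm^3 -> kg/m^3 is a factor of 1000; multiply by depth and carbon fraction
    return dry_bulk_density_g_cm3 * 1000.0 * depth_m * organic_carbon_fraction

# Hypothetical peat layer: 0.15 g/cm^3 bulk density, 50% organic carbon, 0.5 m depth
print(f"{carbon_stock_kg_per_m2(0.15, 0.50, 0.5):.1f} kg C/m^2")
```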
Abstract:
This paper presents an approach to COLREGs compliant ship navigation. A system architecture is proposed, which will be implemented and tested on two platforms: networked bridge simulators and at sea trials using an autonomous unmanned surface vessel. Attention is paid to collision avoidance software and its risk mitigation.
Abstract:
Despite recent technological innovations, the transport sector continues to have significant impacts on the economy and the environment. Indeed, success in reducing emissions in this sector has been lower than desirable. This is due to various factors, such as urban sprawl and the existence of several obstacles to the market penetration of cleaner technologies. Consequently, the "Europe 2020" strategy highlights the need to improve the efficiency with which existing road infrastructure is used. In this context, the main objective of this work is to improve the understanding of how an appropriate route choice can contribute to reducing emissions under different spatial and temporal circumstances. At the same time, it aims to evaluate different traffic management strategies, namely their potential in terms of performance and energy and environmental efficiency. The integration of empirical and analytical methods for assessing the impact of different traffic optimization strategies on CO2 and local pollutant emissions is one of the main contributions of this work. This thesis is divided into two main components. The first, predominantly empirical, was based on vehicles equipped with a GPS data logger to collect the driving dynamics data needed to compute emissions. Approximately 13,200 km were driven on several routes with distinct scales and characteristics: an urban area (Aveiro), a metropolitan area (Hampton Roads, VA) and an intercity corridor (Porto-Aveiro). The second part, predominantly analytical, was based on the application of an integrated traffic and emissions simulation platform. Using this platform, performance functions were developed for several segments of the studied networks, which were in turn applied in traffic assignment models. The results from both perspectives demonstrated that fuel consumption and emissions can be significantly reduced through appropriate route choices and advanced traffic management systems. Empirically, it was shown that selecting an appropriate route can contribute to a significant reduction in emissions: potential reductions of up to 25% in CO2 emissions and up to 60% in local pollutants were identified. Through the application of traffic models, it was shown that traffic-related environmental costs can be significantly reduced (by up to 30%) by changing the distribution of flows along a corridor with four alternative routes. However, despite the positive results regarding the potential for emission reductions based on appropriate route selection, some trade-offs and constraints were identified that should be considered in future eco-routing systems. Among these constraints, it is worth noting that: i) minimizing different pollutants may imply different routing strategies; ii) minimizing pollutant emissions frequently involves choosing urban routes (in densely populated areas); iii) at higher penetration levels of eco-routing devices, system-wide environmental impacts may be greater than if drivers were guided by traditional devices focused on minimizing travel time.
This work demonstrated that traffic management strategies aimed at minimizing CO2 emissions are compatible with minimizing travel time. On the other hand, minimizing local pollutants can lead to a considerable increase in travel time. However, given the downward trend in emission factors for local pollutants, these conflicting objectives are expected to be attenuated in the medium term. The methodology developed shows high application potential, whether through mobile devices, infrastructure-to-vehicle communication systems or other advanced traffic management systems.
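The empirical component computes emissions from GPS-logged driving dynamics. One widely used way to do this, not necessarily the exact model used in the thesis, is to bin each second of driving by Vehicle Specific Power (VSP) and apply modal emission factors. A minimal sketch under that assumption follows, using the commonly cited light-duty VSP formulation; the mode boundaries and CO2 factors are purely illustrative, not calibrated values.

```python
def vsp_kw_per_tonne(speed_ms: float, accel_ms2: float, grade: float = 0.0) -> float:
    """Vehicle Specific Power for a typical light-duty vehicle (kW/tonne),
    following the commonly used Jimenez-Palacios formulation."""
    return speed_ms * (1.1 * accel_ms2 + 9.81 * grade + 0.132) + 0.000302 * speed_ms ** 3

# Purely illustrative CO2 factors (g/s) per coarse VSP mode -- not calibrated values.
CO2_G_PER_S = {"idle/decel": 0.8, "cruise": 2.5, "hard accel": 6.0}

def co2_for_second(speed_ms: float, accel_ms2: float) -> float:
    """Assign one second of driving to a coarse VSP mode and return its CO2 estimate."""
    vsp = vsp_kw_per_tonne(speed_ms, accel_ms2)
    if vsp <= 0:
        return CO2_G_PER_S["idle/decel"]
    if vsp <= 10:
        return CO2_G_PER_S["cruise"]
    return CO2_G_PER_S["hard accel"]

# Example: a short 1 Hz speed trace (m/s) as logged by a GPS device
trace = [0.0, 2.0, 5.0, 9.0, 13.0, 15.0, 15.0, 14.0]
total = sum(co2_for_second(v, trace[i + 1] - v) for i, v in enumerate(trace[:-1]))
print(f"estimated CO2 over trace: {total:.1f} g")
```

Summing such per-second estimates along alternative routes is what allows the empirical comparison of route choices described above.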
Abstract:
Portugal is one of the European countries with the best spatial and population coverage of its motorway network (5th among the EU-27). The marked growth of this network in recent years makes it necessary to use methodologies for analysing and assessing the quality of service provided by these infrastructures with respect to traffic operating conditions. Quality of service is usually assessed by means of internationally accepted methodologies, most notably the one set out in the Highway Capacity Manual (HCM). It is with this methodology that levels of service are usually determined in Portugal for the various components of a motorway (basic freeway segments, ramps and weaving segments). However, its direct transposition to the Portuguese reality raises some reservations, since the elements that make up the road environment (infrastructure, vehicle and driver) differ from those of the North American reality for which it was developed. It would therefore be useful for stakeholders in the road sector to have methodologies developed for Portuguese conditions, allowing a more realistic characterization of the quality of service of motorway operations. It should be noted, however, that developing such methodologies requires a very significant amount of geometric and traffic data, which entails an enormous need for both human and material resources. This approach is therefore difficult to carry out, and alternative methodologies are needed to pursue this objective. Recently, there has been increasingly widespread use of microscopic traffic simulation models, which, by simulating the individual movement of vehicles in a virtual environment, allow traffic analyses to be performed. This dissertation presents the results obtained in the development of a methodology that seeks to recreate, through microscopic traffic simulators, the behaviour of traffic streams on basic motorway segments, with the aim of subsequently adapting the methodology set out in the HCM (2000 edition) to the Portuguese reality. To this end, the microscopic simulators used (AIMSUN and VISSIM) were employed to reproduce the operating conditions on a Portuguese motorway, so that the changes in the behaviour of traffic streams could be analysed after modifying the main geometric and traffic factors involved in the HCM 2000 methodology. For this purpose, a sensitivity analysis of the simulators was carried out to assess their capability to represent the influence of these factors, with a view to subsequently quantifying their effect for the national reality and thereby adapting that methodology to the Portuguese context. In summary, this work presents the main advantages and limitations of the AIMSUN and VISSIM microsimulators in modelling traffic on a Portuguese motorway. It concludes that these simulators are not able to explicitly represent some of the factors considered in the HCM 2000 methodology, which prevents their use as a tool for quantifying the effects of those factors and consequently makes the adaptation of that methodology to the national reality unfeasible.
Some indications are nevertheless given as to how these limitations might be overcome, with a view to achieving that adaptation in the future.
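For reference, the HCM 2000 procedure the dissertation seeks to adapt grades basic freeway segments by density (passenger cars per mile per lane), computed as flow rate divided by average speed. A minimal sketch of that final grading step is shown below; the breakpoints are the approximate HCM 2000 thresholds quoted from memory, so treat them as indicative rather than authoritative.

```python
def level_of_service(flow_pc_h_ln: float, speed_mi_h: float) -> str:
    """Basic freeway segment LOS from density = flow / speed (pc/mi/ln),
    using approximate HCM 2000 breakpoints."""
    density = flow_pc_h_ln / speed_mi_h
    thresholds = [(11, "A"), (18, "B"), (26, "C"), (35, "D"), (45, "E")]
    for limit, los in thresholds:
        if density <= limit:
            return los
    return "F"

# Example: 1800 pc/h/ln at 60 mi/h -> density of 30 pc/mi/ln -> LOS D
print(level_of_service(1800, 60))
```

Adapting such a methodology to Portuguese conditions would mean recalibrating the speed-flow relationships and adjustment factors that feed this step, which is precisely where the microsimulators' limitations were found to matter.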