972 results for Virtual Reference Station (VRS)
Abstract:
This paper presents an automatic vision-based system for UUV station keeping. The vehicle is equipped with a down-looking camera, which provides images of the sea floor. The station-keeping system is based on a feature-based motion detection algorithm, which exploits standard correlation and explicit textural analysis to solve the correspondence problem. A visual map of the area surveyed by the vehicle is constructed to increase the flexibility of the system, allowing the vehicle to position itself when it has lost the reference image. The testing platform is the URIS underwater vehicle. Experimental results demonstrating the behavior of the system in a real environment are presented.
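The abstract does not detail the matcher; as a hypothetical illustration of the standard correlation step used for such correspondence problems, a normalised cross-correlation (NCC) score between two equal-size patches (flattened to 1-D lists for simplicity) can be sketched as:

```python
def ncc(a, b):
    """Normalised cross-correlation between two equal-size patches.

    Returns a value in [-1, 1]; 1 means the patches match up to an
    affine brightness change, which is why NCC is a common matching
    score for correspondence search.
    """
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    num = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    den = (sum((x - mean_a) ** 2 for x in a)
           * sum((y - mean_b) ** 2 for y in b)) ** 0.5
    return num / den if den else 0.0
```

A station-keeping loop of this kind would slide a reference patch over the current frame and keep the offset with the highest score.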
Abstract:
Using an immersive virtual reality system, we measured the ability of observers to detect the rotation of an object when its movement was yoked to the observer's own translation. Most subjects had a large bias such that a static object appeared to rotate away from them as they moved. Thresholds for detecting target rotation were similar to those for an equivalent speed discrimination task carried out by static observers, suggesting that visual discrimination is the predominant limiting factor in detecting target rotation. Adding a stable visual reference frame almost eliminated the bias. Varying the viewing distance of the target had little effect, consistent with observers underestimating distance walked. However, accuracy of walking to a briefly presented visual target was high and not consistent with an underestimation of distance walked. We discuss implications for theories of a task-independent representation of visual space. © 2005 Elsevier Ltd. All rights reserved.
Abstract:
It is reported in the literature that distances from the observer are underestimated more in virtual environments (VEs) than in physical world conditions. On the other hand, estimation of size in VEs is quite accurate and follows a size-constancy law when rich cues are present. This study investigates how estimation of distance in a CAVE™ environment is affected by poor and rich cue conditions, subject experience, and environmental learning when the position of the objects is estimated using an experimental paradigm that exploits size constancy. A group of 18 healthy participants was asked to move a virtual sphere controlled using the wand joystick to the position where they thought a previously displayed virtual cube (stimulus) had appeared. Real-size physical models of the virtual objects were also presented to the participants as a reference of real physical distance during the trials. An accurate estimation of distance implied that the participants assessed the relative size of sphere and cube correctly. The cube appeared at depths between 0.6 m and 3 m, measured along the depth direction of the CAVE. The task was carried out in two environments: a poor cue one with limited background cues, and a rich cue one with textured background surfaces. It was found that distances were underestimated in both poor and rich cue conditions, with greater underestimation in the poor cue environment. The analysis also indicated that factors such as subject experience and environmental learning were not influential. However, least-squares fitting of Stevens' power law indicated a high degree of accuracy during the estimation of object locations. This accuracy was higher than in other studies which were not based on a size-estimation paradigm. Thus, as an indirect result, this study appears to show that accuracy when estimating egocentric distances may be increased using an experimental method that provides information on the relative size of the objects used.
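The least-squares fitting of Stevens' power law mentioned above is commonly done as a linear regression in log-log coordinates; a minimal sketch (the function name and the noiseless synthetic data are illustrative, not from the study) is:

```python
import math

def fit_stevens(physical, perceived):
    """Fit perceived = k * physical**a by least squares on the
    linearised form log(perceived) = log(k) + a*log(physical)."""
    xs = [math.log(p) for p in physical]
    ys = [math.log(q) for q in perceived]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    k = math.exp(my - a * mx)
    return k, a
```

An exponent a below 1 would indicate compressive underestimation of depth of the kind the study reports.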
Abstract:
This paper proposes a solution to the problems associated with network latency within distributed virtual environments. It begins by discussing the advantages and disadvantages of synchronous and asynchronous distributed models, in the areas of user and object representation and user-to-user interaction. By introducing a hybrid solution, which utilises the concept of a causal surface, the advantages of both synchronous and asynchronous models are combined. Object distortion is a characteristic feature of the hybrid system, and this is proposed as a solution which facilitates dynamic real-time user collaboration. The final section covers implementation details, with reference to a prototype system available from the Internet.
Abstract:
In Peer-to-Peer (P2P) networks, it is often desirable to assign node IDs which preserve locality relationships in the underlying topology. Node locality can be embedded into node IDs by utilizing a one-dimensional mapping by a Hilbert space-filling curve on a vector of network distances from each node to a subset of reference landmark nodes within the network. However, this approach is fundamentally limited because, while robustness and accuracy might be expected to improve with the number of landmarks, the effectiveness of one-dimensional Hilbert curve mapping suffers from the curse of dimensionality. This work proposes an approach to solve this issue using Landmark Multidimensional Scaling (LMDS) to reduce a large set of landmarks to a smaller set of virtual landmarks. This smaller set of landmarks has been postulated to represent the intrinsic dimensionality of the network space, and therefore a space-filling curve applied to these virtual landmarks is expected to produce a better mapping of the node ID space. The proposed approach, the Virtual Landmarks Hilbert Curve (VLHC), is particularly suitable for decentralised systems like P2P networks. In the experimental simulations, the effectiveness of the methods is measured by means of the locality preservation derived from node IDs in terms of latency to nearest neighbours. A variety of realistic network topologies are simulated, and this work provides strong evidence to suggest that VLHC performs better than either Hilbert curves or LMDS used independently of each other.
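The multidimensional VLHC construction is beyond a short sketch, but the locality preservation that Hilbert-curve ID assignment relies on can be illustrated in two dimensions with the standard iterative cell-to-index conversion (a toy example, not the authors' code):

```python
def xy2d(n, x, y):
    """Index of cell (x, y) along a Hilbert curve over an n-by-n grid
    (n a power of two). Nearby cells receive nearby indices, which is
    the locality property exploited when assigning node IDs."""
    d = 0
    s = n // 2
    while s > 0:
        rx = 1 if x & s else 0
        ry = 1 if y & s else 0
        d += s * s * ((3 * rx) ^ ry)
        # Rotate/reflect into the sub-quadrant so the recursion is
        # self-similar at the next level.
        if ry == 0:
            if rx == 1:
                x = s - 1 - (x & (s - 1))
                y = s - 1 - (y & (s - 1))
            x, y = y, x
        s //= 2
    return d
```

Consecutive indices always belong to grid-adjacent cells, so sorting nodes by index groups topological neighbours; VLHC applies the same idea in a reduced, virtual-landmark coordinate space.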
Abstract:
The robot control problem is discussed with regard to controller implementation on a multitransputer array. Some high-performance aspects required of such controllers are described, with particular reference to robot force control. The implications for the architecture required for controllers based on computed torque are discussed and an example is described. The idea of treating a transputer array as a virtual bus is put forward for the implementation of fast real-time controllers. An example is given of controlling a Puma 560 industrial robot. Some of the practical considerations for using transputers for such control are described.
Abstract:
The phonographic market has experienced a period of significant change caused by technological evolution. This phenomenon of global proportions has been the subject of considerable debate in the media and in academia. The entry of new actors, as well as piracy and new forms of commercialization of musical products, has significantly altered the relationships of power existing in this field. Therefore, the scope of this article is to analyze the phonographic industry in Brazil as an arena of forces and power struggles, based on notions of the field of cultural production, from the perspective of Bourdieu. This study constitutes qualitative research, and data were analyzed using a descriptive-interpretative approach. In the case of the sector under scrutiny, and on the basis of the theoretical reference material, it would appear to be correct to affirm that economic capital is what is being sought on the part of the actors who comprise the field. Nevertheless, it is important to stress that the critical incidents that brought about the changes in the structure of the field over the course of time were predominantly of a technological nature. The field of the Brazilian phonographic market is currently experiencing a period of structural alteration that was especially affected by the development of MP3 technology and the emergence of virtual piracy. The fact is that formerly the major recording companies dominated the market and had the necessary resources of power to exercise their role as dominant actors and maintain this position. However, the aforementioned factors favored the entry of new actors in the field and the empowerment of those that prior to this time did not have the resources to compete against this domination.
Abstract:
(Screening of native Brazilian plants for antimicrobial and DNA-damaging activities I. Atlantic Forest. Juréia-Itatins Ecological Station). Eighty-eight species native to the state of São Paulo were collected in an Atlantic Forest region and assayed for antimicrobial activity and the capacity to cause DNA damage. Of the 114 extracts submitted to antibacterial assays, only the leaf and branch extracts of Aspidosperma ramiflorum (Apocynaceae) showed weak activity against Escherichia coli. In the antifungal assay with Candida albicans, no active extracts were observed. On the other hand, in the bioautography assay with Cladosporium sphaerospermum and C. cladosporioides, 12% of the extracts showed activity. However, in this assay only the branch extract of Psychotria mapoureoides (Rubiaceae) strongly inhibited the growth of both fungal species. The DNA-damage assay with mutant strains of Saccharomyces cerevisiae yielded 17.5% active extracts. Most of the active extracts (55%) showed results selective for damage dependent on topoisomerase II as the DNA repair mechanism, and only 20% were selective for the topoisomerase I mechanism.
Abstract:
One approach to verifying the adequacy of methods for estimating reference evapotranspiration is comparison with the Penman-Monteith method, recommended by the Food and Agriculture Organization of the United Nations (FAO) as the standard method for estimating ET0. This study aimed to compare the Makkink (MK), Hargreaves (HG) and Solar Radiation (RS) methods for estimating ET0 with Penman-Monteith (PM). For this purpose, we used daily data of global solar radiation, air temperature, relative humidity and wind speed for the year 2010, obtained from the automatic meteorological station (latitude 18° 91' 66 S, longitude 48° 25' 05 W, altitude 869 m) of the National Institute of Meteorology, situated on the campus of the Federal University of Uberlandia - MG, Brazil. Results for the period were analyzed on a daily basis, using regression analysis and considering the linear model y = ax, where the dependent variable was the Penman-Monteith method and the independent variable was the ET0 estimate of each evaluated method. A methodology was applied to check the influence of the standard deviation of daily ET0 on the comparison of methods. The evaluation indicated that the Solar Radiation and Penman-Monteith methods cannot be compared, while the Hargreaves method provides the best-fitting adjustment for estimating ET0.
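Of the compared methods, Hargreaves has the simplest closed form, ET0 = 0.0023 (Tmean + 17.8) (Tmax - Tmin)^0.5 Ra, with Ra the extraterrestrial radiation expressed as evaporation equivalent in mm/day; a minimal sketch with illustrative inputs (not the station's data) is:

```python
def et0_hargreaves(t_max, t_min, ra):
    """Hargreaves reference evapotranspiration ET0 (mm/day).

    t_max, t_min: daily maximum/minimum air temperature (deg C)
    ra: extraterrestrial radiation as evaporation equivalent (mm/day)
    """
    t_mean = (t_max + t_min) / 2.0
    return 0.0023 * (t_mean + 17.8) * (t_max - t_min) ** 0.5 * ra

# e.g. et0_hargreaves(30.0, 20.0, 15.0) is about 4.67 mm/day
```

The study's comparison then regresses daily Penman-Monteith ET0 on estimates like these through the no-intercept model y = ax.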
Abstract:
This report differs from previous reports in two respects: it covers experimental work up to January 1, 1935, and it includes brief abstracts of publications since the last report. Previously most of the report dealt with work done before the end of the fiscal year; that is, work done between June 30 and January 1 was not reported until over a year later, for the most part. The present report corrects that defect, and in addition the abstracts of publications will make the report useful as a reference guide to published matter. The projects are discussed under subject headings and in addition to the abstracts, brief reports of progress in projects under way are included. Complete data for these projects are not included; rather an attempt has been made to show how far the work has gone and to indicate some of the directions or trends of the work. The drought of the past summer reduced yields severely. As a result the collection of significant data on yields was almost impossible. A few of the Experiment Station workers have been loaned to federal projects. Despite these handicaps many projects have been advanced and many have been completed.
Abstract:
The study of Antarctic archaeal communities adds information on the biogeography of this group and helps in understanding the dynamics of biogenic methane production in such extreme habitats. Molecular methods were combined with methane flux determinations in Martel Inlet, Admiralty Bay, to assess archaeal diversity, to obtain information about the contribution of the area to the atmospheric methane budget, and to detect possible interference of the wastewater discharge of the Brazilian Antarctic Station Comandante Ferraz (EACF) with local archaeal communities and methane emissions. Methane fluxes in Martel Inlet ranged from 3.2 to 117.9 µmol CH₄ m⁻² d⁻¹, with an average of 51.3 ± 8.5 µmol CH₄ m⁻² d⁻¹ and a median of 57.6 µmol CH₄ m⁻² d⁻¹. However, three negative fluxes averaging -11.3 µmol CH₄ m⁻² d⁻¹ were detected in MacKellar Inlet, indicating that Admiralty Bay can be either a source or a sink of atmospheric methane. Denaturing gradient gel electrophoresis (DGGE) showed that archaeal communities at EACF varied with depth and formed a group separated from the reference sites. Granulometric analysis indicated that the differences observed may be mostly related to sediment type. However, an influence of wastewater input could not be discarded, since higher methane fluxes were found at the EACF site, suggesting stimulation of local methanogenesis. The DGGE profile of the wastewater sample grouped separately from all other samples, suggesting that methanogenesis stimulation may be due to changes in environmental conditions rather than to the input of allochthonous species from the wastewater. Analysis of 16S ribosomal DNA clone libraries showed that all wastewater sequences were related to known methanogenic groups belonging to the hydrogenotrophic genera Methanobacterium and Methanobrevibacter and the aceticlastic genus Methanosaeta. EACF and Botany Point sediment clone libraries retrieved only groups of uncultivated Archaea, with a predominance of Crenarchaeota representatives (MCG, MG1, MBG-B, MBG-C and MHVG groups). The Euryarchaeota sequences found were mostly related to the LDS and RC-V groups, but MBG-D and DHVE-5 were also present. No representatives of cultivated methanogenic groups were found, but coverage estimates suggest that a higher number of clones would have to be analyzed in order to cover the greater archaeal diversity of Martel Inlet sediment. Nevertheless, the analysis of the libraries revealed groups not commonly found by other authors in Antarctic habitats and also indicated the presence of groups of uncultivated archaea previously associated with methane-rich environments or with the methane cycle. © 2010 Elsevier Ltd. All rights reserved.
Abstract:
This thesis is mainly concerned with a model calculation for generalized parton distributions (GPDs). We calculate vectorial and axial GPDs for the N → N and N → Δ transitions in the framework of a light-front quark model. This requires the elaboration of a connection between transition amplitudes and GPDs. We provide the first quark model calculations for N → Δ GPDs. The examination of transition amplitudes leads to various model-independent consistency relations. These relations are not exactly obeyed by our model calculation, since the use of the impulse approximation in the light-front quark model leads to a violation of Poincaré covariance. We explore the impact of this covariance breaking on the GPDs and form factors which we determine in our model calculation, and find large effects. The reference-frame dependence of our results, which originates from the breaking of Poincaré covariance, can be eliminated by introducing spurious covariants. We extend this formalism in order to obtain frame-independent results from our transition amplitudes.
Abstract:
Beamforming entails joint processing of multiple signals received or transmitted by an array of antennas. This thesis addresses the implementation of beamforming in two distinct systems, namely a distributed network of independent sensors, and a broad-band multi-beam satellite network. With the rising popularity of wireless sensors, scientists are taking advantage of the flexibility of these devices, which come with very low implementation costs. Simplicity, however, is intertwined with scarce power resources, which must be carefully rationed to ensure successful measurement campaigns throughout the whole duration of the application. In this scenario, distributed beamforming is a cooperative communication technique which allows nodes in the network to emulate a virtual antenna array, seeking power gains on the order of the size of the network itself when required to deliver a common message signal to the receiver. To achieve a desired beamforming configuration, however, all nodes in the network must agree upon the same phase reference, which is challenging in a distributed set-up where all devices are independent. The first part of this thesis presents new algorithms for phase alignment, which prove to be more energy efficient than existing solutions. With the ever-growing demand for broad-band connectivity, satellite systems have great potential to guarantee service where terrestrial systems cannot penetrate. In order to satisfy the constantly increasing demand for throughput, satellites are equipped with multi-fed reflector antennas to resolve spatially separated signals. However, increasing the number of feeds on the payload corresponds to burdening the link between the satellite and the gateway with an extensive amount of signaling, and possibly to calling for much more expensive multiple-gateway infrastructures. This thesis focuses on an on-board non-adaptive signal processing scheme denoted Coarse Beamforming, whose objective is to reduce the communication load on the link between the ground station and the space segment.
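The thesis's specific phase-alignment algorithms are not given in the abstract; a classical baseline in the distributed-beamforming literature is randomised perturbation with one-bit feedback, sketched here in a simplified one-node-at-a-time variant (all names and parameters are illustrative):

```python
import cmath
import math
import random

def one_bit_feedback(n_nodes=20, n_iters=4000, sigma0=0.3,
                     decay=0.999, seed=7):
    """Each round, one node dithers its carrier phase; the receiver
    broadcasts a single bit saying whether the combined signal got
    stronger, and the perturbation is kept or discarded accordingly.
    Returns the final coherence (1.0 = perfectly aligned array)."""
    rng = random.Random(seed)
    phases = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(n_nodes)]

    def gain(ph):
        # Magnitude of the coherent sum of unit-amplitude carriers.
        return abs(sum(cmath.exp(1j * p) for p in ph))

    best, sigma = gain(phases), sigma0
    for _ in range(n_iters):
        i = rng.randrange(n_nodes)
        trial = list(phases)
        trial[i] += rng.gauss(0.0, sigma)
        g = gain(trial)
        if g > best:          # the single feedback bit: "better"
            phases, best = trial, g
        sigma *= decay        # slowly shrink the dither
    return best / n_nodes
```

Starting from random phases (coherence around 0.2 for 20 nodes), the loop typically climbs close to 1.0, i.e. toward the N-fold power gain the abstract refers to, at a cost of only one feedback bit per round.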
Abstract:
Purpose: To determine the potential of minimally invasive postmortem computed tomographic (CT) angiography combined with image-guided tissue biopsy of the myocardium and lungs in decedents who were thought to have died of acute chest disease, and to compare this method with conventional autopsy as the reference standard. Materials and Methods: The responsible justice department and ethics committee approved this study. Twenty corpses (four female and 16 male; age range, 15-80 years), all of whom were reported to have had antemortem acute chest pain, were imaged with postmortem whole-body CT angiography and underwent standardized image-guided biopsy. The standard included three biopsies of the myocardium and a single biopsy of bilateral central lung tissue. Additional biopsies of pulmonary clots for differentiation of pulmonary embolism from postmortem organized thrombus were performed after initial analysis of the cross-sectional images. Subsequent traditional autopsy with sampling of histologic specimens was performed in all cases. Thereafter, conventional histologic and autopsy reports were compared with postmortem CT angiography and CT-guided biopsy findings. A Cohen κ coefficient analysis was performed to explore the effect of the clustered nature of the data. Results: In 19 of the 20 cadavers, findings at postmortem CT angiography in combination with CT-guided biopsy validated the cause of death found at traditional autopsy. In one cadaver, early myocardial infarction of the papillary muscles had been missed. The Cohen κ coefficient was 0.94. There were four instances of pulmonary embolism, three aortic dissections (Stanford type A), three myocardial infarctions, three instances of fresh coronary thrombosis, three cases of obstructive coronary artery disease, one ruptured ulcer of the ascending aorta, one ruptured aneurysm of the right subclavian artery, one case of myocarditis, and one pulmonary malignancy with pulmonary artery erosion. In seven of 20 cadavers, CT-guided biopsy provided additional histopathologic information that substantiated the final diagnosis of the cause of death. Conclusion: Postmortem CT angiography combined with image-guided biopsy, because of its minimally invasive nature, has a potential role in the detection of the cause of death after acute chest pain. © RSNA, 2012.