6 results for Multi-soft sets

in QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast


Relevance:

30.00%

Publisher:

Abstract:

Unmanned surface vehicles (USVs) are able to accomplish difficult and challenging tasks in both the civilian and defence sectors without endangering human lives. Their ability to work round the clock makes them well-suited for matters that demand immediate attention. These include, but are not limited to, mine countermeasures, measuring the extent of an oil spill and locating the source of a chemical discharge. A number of USV programmes have emerged in the last decade for a variety of the aforementioned purposes. Springer USV is one such research project highlighted in this paper. The intention herein is to report results emanating from data acquired from experiments on the Springer vessel whilst testing its advanced navigation, guidance and control (NGC) subsystems. The algorithms developed for these systems are based on soft-computing methodologies. A novel form of data fusion navigation algorithm has been developed and integrated with a modified optimal controller. Experimental results are presented and analysed for various scenarios, including single- and multiple-waypoint tracking and fixed and time-varying reference bearings. It is demonstrated that the proposed NGC system provides promising results despite the presence of modelling uncertainty and external disturbances.
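The abstract does not disclose the Springer NGC algorithms themselves, but the multiple-waypoint tracking scenario it evaluates can be illustrated with a basic line-of-sight guidance sketch. All function names and the acceptance radius below are illustrative assumptions, not the Springer implementation:

```python
import math

def los_heading(pos, waypoint):
    """Desired heading (radians) from the current position to the active waypoint."""
    dx = waypoint[0] - pos[0]
    dy = waypoint[1] - pos[1]
    return math.atan2(dy, dx)

def reached(pos, waypoint, radius=5.0):
    """True once the vessel is inside an acceptance circle around the waypoint."""
    return math.hypot(waypoint[0] - pos[0], waypoint[1] - pos[1]) < radius

def track(pos, waypoints, idx):
    """Advance to the next waypoint when the current one is reached,
    and return the heading reference for the controller."""
    if idx < len(waypoints) - 1 and reached(pos, waypoints[idx]):
        idx += 1
    return los_heading(pos, waypoints[idx]), idx
```

In a full NGC loop the heading reference produced here would be the input to the (in Springer's case, modified optimal) heading controller, with the fused navigation estimate supplying `pos`.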

Relevance:

30.00%

Publisher:

Relevance:

30.00%

Publisher:

Abstract:

Accurate conceptual models of groundwater systems are essential for correct interpretation of monitoring data in catchment studies. In surface-water-dominated hard rock regions, modern ground and surface water monitoring programmes often have very high resolution chemical, meteorological and hydrological observations but lack an equivalent emphasis on the subsurface environment, the properties of which exert a strong control on flow pathways and interactions with surface waters. The reasons for this disparity are the complexity of the system and the difficulty in accurately characterising the subsurface, except locally at outcrops or in boreholes. This is particularly the case in maritime north-western Europe, where a legacy of glacial activity, combined with large areas underlain by heterogeneous igneous and metamorphic bedrock, makes the structure and weathering of bedrock difficult to map or model. Traditional approaches which seek to extrapolate information from borehole to field scale are of limited application in these environments due to the high degree of spatial heterogeneity. Here we apply an integrative and multi-scale approach, optimising and combining standard geophysical techniques to generate a three-dimensional geological conceptual model of the subsurface in a catchment in NE Ireland. Available airborne LiDAR, electromagnetic and magnetic data sets were analysed for the region. At the field scale, surface geophysical methods including electrical resistivity tomography, seismic refraction, ground penetrating radar and magnetic surveys were used and combined with field mapping of outcrops and borehole testing. The study demonstrates how combined interpretation of multiple methods at a range of scales produces robust three-dimensional conceptual models and a stronger basis for interpreting groundwater and surface water monitoring data.
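The abstract describes combining interpretations from several geophysical methods into one conceptual model but gives no detail of how. As an illustration only, a minimal sketch of one possible fusion step is a weighted consensus of co-registered estimates; the profiles, weights, and method names below are hypothetical, not data from the study:

```python
def fuse_layers(interpretations, weights):
    """Weighted consensus of co-registered depth-to-bedrock estimates (m)
    from several geophysical methods along the same transect."""
    total_w = sum(weights)
    return [
        sum(w * est[i] for est, w in zip(interpretations, weights)) / total_w
        for i in range(len(interpretations[0]))
    ]

# Hypothetical depth-to-bedrock profiles (m) at three stations on one transect:
ert = [3.0, 4.5, 6.0]   # electrical resistivity tomography
srt = [2.8, 4.9, 6.4]   # seismic refraction
gpr = [3.2, 4.4, 5.8]   # ground penetrating radar

# Down-weight GPR where penetration is assumed poorer.
consensus = fuse_layers([ert, srt, gpr], weights=[1.0, 1.0, 0.5])
```

In practice such fusion would be done per grid cell over the full survey area and validated against the outcrop mapping and borehole tests the abstract mentions.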

Relevance:

30.00%

Publisher:

Abstract:

We present a novel method for the light-curve characterization of Pan-STARRS1 Medium Deep Survey (PS1 MDS) extragalactic sources into stochastic variables (SVs) and burst-like (BL) transients, using multi-band image-differencing time-series data. We select detections in difference images associated with galaxy hosts using a star/galaxy catalog extracted from the deep PS1 MDS stacked images, and adopt a maximum a posteriori formulation to model their difference-flux time-series in four Pan-STARRS1 photometric bands: gP1, rP1, iP1, and zP1. We use three deterministic light-curve models to fit BL transients (a Gaussian, a Gamma distribution, and an analytic supernova (SN) model) and one stochastic light-curve model, the Ornstein-Uhlenbeck process, in order to fit variability that is characteristic of active galactic nuclei (AGNs). We assess the quality of fit of the models band-wise and source-wise, using their estimated leave-one-out cross-validation likelihoods and corrected Akaike information criteria. We then apply a K-means clustering algorithm on these statistics, to determine the source classification in each band. The final source classification is derived as a combination of the individual filter classifications, resulting in two measures of classification quality, from the averages across the photometric filters of (1) the classifications determined from the closest K-means cluster centers, and (2) the square distances from the clustering centers in the K-means clustering spaces. For a verification set of AGNs and SNe, we show that SVs and BL transients occupy distinct regions in the plane constituted by these measures. We use our clustering method to characterize 4361 extragalactic image-difference detected sources, in the first 2.5 yr of the PS1 MDS, into 1529 BL and 2262 SV, with a purity of 95.00% for AGNs and 90.97% for SNe based on our verification sets.
We combine our light-curve classifications with their nuclear or off-nuclear host galaxy offsets to define a robust photometric sample of 1233 AGNs and 812 SNe. With these two samples, we characterize their variability and host galaxy properties, and identify simple photometric priors that would enable their real-time identification in future wide-field synoptic surveys.
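The corrected Akaike information criterion (AICc) used for the band-wise model comparison has a standard closed form: AICc = 2k - 2 ln L + 2k(k + 1)/(n - k - 1) for k parameters and n data points. A minimal sketch of comparing fitted models this way follows; the model names, log-likelihoods, and parameter counts are hypothetical, not the paper's fits:

```python
def aicc(log_likelihood, k, n):
    """Corrected Akaike information criterion for a fit with k parameters on n points."""
    aic = 2 * k - 2 * log_likelihood
    return aic + (2 * k * (k + 1)) / (n - k - 1)

def best_model(fits, n):
    """fits maps model name -> (log_likelihood, n_params); the lowest AICc wins."""
    return min(fits, key=lambda m: aicc(fits[m][0], fits[m][1], n))

# Hypothetical maximized log-likelihoods for one source in one band:
fits = {
    "gaussian": (-120.3, 4),  # burst-like (deterministic) model
    "ou":       (-111.7, 3),  # Ornstein-Uhlenbeck (stochastic) model
}
```

In the paper's pipeline, statistics like these (together with cross-validation likelihoods) are what the per-band K-means clustering is run on, rather than a single winner being picked directly.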

Relevance:

30.00%

Publisher:

Abstract:

With security and surveillance, there is an increasing need to process image data efficiently and effectively, either at source or in a large data network. Whilst the Field-Programmable Gate Array (FPGA) has been seen as a key technology for enabling this, the design process has been viewed as problematic in terms of the time and effort needed for implementation and verification. The work here proposes a different approach of using optimized FPGA-based soft-core processors, which allows the user to exploit task- and data-level parallelism to achieve the quality of dedicated FPGA implementations whilst reducing design time. The paper also reports some preliminary progress on the design flow to program the structure. An implementation of a Histogram of Gradients algorithm is also reported, which shows that a performance of 328 fps can be achieved with this design approach, whilst avoiding the long design time, verification and debugging steps associated with conventional FPGA implementations.
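The Histogram of Gradients computation the paper accelerates can be sketched at a high level as per-cell orientation histograms of gradient magnitudes. The following is an illustrative reference sketch of one cell's histogram (not the paper's soft-core implementation, which would distribute such cells across processors to exploit the data-level parallelism described above); cell size and bin count are common defaults, assumed here:

```python
import math

def hog_cell(gray, x0, y0, cell=8, bins=9):
    """Unsigned-gradient orientation histogram for one cell of a 2-D
    grayscale image (list of lists). Interior cells only: requires a
    1-pixel border around the cell for central differences."""
    hist = [0.0] * bins
    for y in range(y0, y0 + cell):
        for x in range(x0, x0 + cell):
            gx = gray[y][x + 1] - gray[y][x - 1]   # horizontal central difference
            gy = gray[y + 1][x] - gray[y - 1][x]   # vertical central difference
            mag = math.hypot(gx, gy)
            ang = math.degrees(math.atan2(gy, gx)) % 180.0  # unsigned, 0-180 deg
            hist[int(ang // (180.0 / bins)) % bins] += mag   # magnitude-weighted vote
    return hist
```

Each cell's histogram is independent of every other cell's, which is exactly the kind of task- and data-level parallelism that maps well onto an array of soft-core processors.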