931 results for large scale linear system


Relevance: 100.00%

Publisher:

Abstract:

A technique for optimizing the efficiency of the sub-map method for large-scale simultaneous localization and mapping (SLAM) is proposed. It exploits the benefits of the sub-map technique to improve the accuracy and consistency of extended Kalman filter (EKF)-based SLAM. Error models were developed and used to investigate several outstanding issues in employing the sub-map technique in SLAM: the size (distance) of an optimal sub-map; the acceptable error effect of the process noise covariance on the predictions and estimations made within a sub-map; when to terminate an existing sub-map and start a new one; and the magnitude of the process noise covariance that could produce such an effect. Numerical results from the study and an error-correcting process were used to optimize the accuracy and convergence of the previously proposed Invariant Information Local Sub-map Filter. Applying this technique to the EKF-based SLAM algorithm (a) reduces the computational burden of maintaining the global map estimates and (b) simplifies the transformation complexities and data association ambiguities usually experienced in fusing sub-maps together. A Monte Carlo analysis of the system demonstrates the consistency and efficacy of the proposed technique.
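The core decision studied in the abstract — when to terminate an existing sub-map and start a new one — can be sketched as a simple rule on distance travelled and accumulated local uncertainty. The threshold values and function names below are hypothetical illustrations; the paper derives its optimal values from its error models.

```python
import numpy as np

def should_start_new_submap(distance_travelled, P, max_distance=50.0, max_trace=2.5):
    """Decide whether to close the current sub-map and open a new one.

    distance_travelled: metres covered since the sub-map was opened.
    P: current EKF covariance matrix of the local (sub-map) state.
    max_distance / max_trace: hypothetical tuning thresholds; the paper
    determines such limits from its error models, not from fixed numbers.
    """
    uncertainty = np.trace(P)  # aggregate local uncertainty grown by process noise
    return distance_travelled > max_distance or uncertainty > max_trace

# A small local covariance, as grown by process noise at each prediction step.
P = np.diag([0.1, 0.1, 0.05])
assert not should_start_new_submap(10.0, P)   # still within the sub-map budget
assert should_start_new_submap(60.0, P)       # distance limit exceeded
```

Keeping each EKF local to a bounded sub-map is what caps the cost of maintaining the global map estimate.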


This paper describes large-scale tests conducted on a novel unglazed solar air collector system. The proposed system, referred to as a back-pass solar collector (BPSC), has on-site installation and aesthetic advantages over conventional unglazed transpired solar collectors (UTSC), as it is fully integrated within a standard insulated wall panel. This paper presents the results obtained from monitoring a BPSC wall panel over one year. Measurements of temperature, wind velocity and solar irradiance were taken at multiple air mass flow rates. It is shown that the length of the collector cavities has a direct impact on the efficiency of the system. It is also shown that, beyond a height-to-flow ratio of 0.023 m/(m³/h/m²), no additional heat output is obtained by increasing the collector height for the experimental setup in this study, although these figures would differ for other experimental setups or test environments (e.g. location and climate). An equation for predicting the temperature rise of the BPSC is proposed.
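The quoted height-to-flow ratio and the notion of collector efficiency can be made concrete with the standard textbook definitions below. These are generic definitions, not the paper's proposed temperature-rise equation (which is not reproduced in the abstract); the example numbers are illustrative only.

```python
def height_to_flow_ratio(collector_height_m, flow_m3_per_hr_per_m2):
    """Ratio quoted in the study: collector height divided by the
    air flow rate per unit collector area, in m / (m^3/h/m^2)."""
    return collector_height_m / flow_m3_per_hr_per_m2

def collector_efficiency(m_dot, cp, t_out, t_in, irradiance, area):
    """Standard instantaneous efficiency of a solar air collector:
    useful heat gained by the air divided by the solar power on the
    aperture (textbook definition, not the paper's correlation)."""
    return m_dot * cp * (t_out - t_in) / (irradiance * area)

# Beyond ~0.023 m/(m^3/h/m^2) the study saw no additional heat output.
assert abs(height_to_flow_ratio(2.3, 100.0) - 0.023) < 1e-9

# e.g. 0.05 kg/s of air (cp ~1005 J/kg.K) warmed by 10 K under 800 W/m^2 on 1 m^2
eff = collector_efficiency(0.05, 1005.0, 30.0, 20.0, 800.0, 1.0)
assert 0.6 < eff < 0.7   # 502.5 W useful / 800 W incident
```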


Recommending users for a new social network user to follow is currently a topic of interest. Existing approaches rely on various types of information about the new user to determine recommended users who share the new user's interests. However, this presents a problem for a new user who joins a social network and has yet to interact on it. In this paper we present a particular type of conversational recommendation approach, critiquing-based recommendation, to solve this cold-start problem. We present a critiquing-based recommendation system, called CSFinder, to recommend users for a new user to follow. A traditional critiquing-based recommendation system allows a user to critique one feature of a recommended item at a time and gradually leads the user to the target recommendation, but this may require a lengthy recommendation session. CSFinder aims to reduce the session length by taking a case-based reasoning approach. From a case base containing the successful recommendation sessions of past users, it selects the relevant past sessions that match the current user's session and uses them to shortcut the current recommendation session. A past recommendation session can be selected if its recommended items and critiques sufficiently overlap with those in the current session. Our experimental results show that CSFinder yields significantly shorter sessions than an Incremental Critiquing system, a baseline critiquing-based recommendation system.
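The session-retrieval step described above can be sketched as an overlap score between the current session and each case in the case base. The scoring function and threshold below are illustrative assumptions; the abstract does not specify CSFinder's exact matching criterion.

```python
def overlap_score(current, past):
    """Score a past recommendation session against the current one by
    counting shared recommended items and shared critiques (an
    illustrative metric; the paper's exact criterion is not given)."""
    item_overlap = len(current["items"] & past["items"])
    critique_overlap = len(current["critiques"] & past["critiques"])
    return item_overlap + critique_overlap

def select_relevant_sessions(current, case_base, min_score=2):
    """Return the past sessions that sufficiently overlap with the
    current session; these are used to shortcut the dialogue."""
    return [s for s in case_base if overlap_score(current, s) >= min_score]

# Hypothetical sessions: items are user ids, critiques are (feature, direction).
current = {"items": {"u1", "u2"}, "critiques": {("followers", ">")}}
case_base = [
    {"items": {"u1", "u3"}, "critiques": {("followers", ">")}},  # overlaps: score 2
    {"items": {"u9"}, "critiques": {("topic", "!=")}},           # unrelated: score 0
]
assert select_relevant_sessions(current, case_base) == [case_base[0]]
```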


The potential of IR absorption and Raman spectroscopy for rapid identification of novel psychoactive substances (NPS) has been tested using a set of 221 unsorted seized samples suspected of containing NPS. Both IR and Raman spectra showed large variation between the different sub-classifications of NPS and smaller, but still distinguishable, differences between closely related compounds within the same class. In initial tests, screening the samples by spectral searching against a limited reference library fully identified only 41% of the samples. The limiting factor was not poor spectral quality but the large number of active compounds in the seized samples for which no reference vibrational data were available in the libraries. Accordingly, when 33 of these compounds were independently identified by NMR and mass spectrometry and their spectra were used to extend the libraries, the percentage of samples identified by IR and Raman screening alone increased to 76%, with only 7% of samples having no identifiable constituents. This study, the largest of its type carried out to date, demonstrates that this approach of detecting non-matching samples and then identifying them using standard analytical methods has considerable potential in NPS screening, since it allows rapid identification of the constituents of the majority of street-quality samples. Only one complete feedback cycle was carried out in this study, but there is clearly the potential for continuous identification and library updating when this system is used in operational settings.
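The screening step rests on spectral library searching. A minimal sketch of one generic scheme — ranking reference spectra by cosine similarity to the query — is shown below; the study's actual search algorithm and match threshold are not specified in the abstract, so all names and numbers here are assumptions.

```python
import numpy as np

def library_search(spectrum, library, threshold=0.95):
    """Rank reference spectra by cosine similarity to the query spectrum
    and return the best hit if it clears the threshold (a generic
    spectral-search scheme, not the study's specific algorithm)."""
    best_name, best_score = None, -1.0
    q = spectrum / np.linalg.norm(spectrum)
    for name, ref in library.items():
        score = float(q @ (ref / np.linalg.norm(ref)))
        if score > best_score:
            best_name, best_score = name, score
    # A query resembling no library entry is flagged as non-matching,
    # triggering off-line identification (NMR/MS) and a library update.
    return (best_name, best_score) if best_score >= threshold else (None, best_score)

# Toy 4-channel "spectra": the query closely matches compound A.
lib = {"A": np.array([1.0, 0.2, 0.0, 0.6]), "B": np.array([0.0, 1.0, 0.9, 0.1])}
name, score = library_search(np.array([1.0, 0.21, 0.02, 0.6]), lib)
assert name == "A" and score > 0.99
```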


An organisation that had developed a large information system wanted to embark on a programme of large-scale evolution of that system. As a precursor, it decided to create a comprehensive architectural description to capture and understand the system's design. This undertaking faced a number of challenges, including low general awareness of software modelling and software architecture practices. The approach taken by the software architects tasked with this project included the definition of a simple, very specific architecture description language (ADL). This paper reports our experience of the project and the simple ADL that we created as part of it.


Central obesity is the hallmark of a number of non-inheritable disorders. The advent of imaging techniques such as MRI has allowed a fast and accurate assessment of body fat content and distribution. However, image analysis continues to be one of the major obstacles to the use of MRI in large-scale studies. In this study we assess the validity of the recently proposed fat–muscle quantitation system (AMRA™ Profiler) for the quantification of intra-abdominal adipose tissue (IAAT) and abdominal subcutaneous adipose tissue (ASAT) from abdominal MR images. Abdominal MR images were acquired from 23 volunteers with a broad range of BMIs and analysed using sliceOmatic, the current gold standard, and the AMRA™ Profiler, which is based on non-rigid image registration of a library of segmented atlases. The results show a highly significant correlation between the fat volumes generated by the two analysis methods (Pearson correlation r = 0.97, p < 0.001), with the AMRA™ Profiler analysis being significantly faster (~3 min) than the conventional sliceOmatic approach (~40 min). There was also excellent agreement between the methods for the quantification of IAAT (AMRA 4.73 ± 1.99 l versus sliceOmatic 4.73 ± 1.75 l, p = 0.97). For the AMRA™ Profiler analysis, the intra-observer coefficient of variation was 1.6% for IAAT and 1.1% for ASAT, the inter-observer coefficient of variation was 1.4% for IAAT and 1.2% for ASAT, the intra-observer correlation was 0.998 for IAAT and 0.999 for ASAT, and the inter-observer correlation was 0.999 for both IAAT and ASAT. These results indicate that precise and accurate measures of body fat content and distribution can be obtained in a fast and reliable way by the AMRA™ Profiler, opening up the possibility of large-scale human phenotypic studies.
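The precision figures quoted above are coefficients of variation of repeated measurements. The standard definition can be sketched as follows; the repeated volumes used in the example are hypothetical, not taken from the study.

```python
import statistics

def coefficient_of_variation(repeats):
    """Intra-/inter-observer CV: the standard deviation of repeated
    volume measurements expressed as a percentage of their mean
    (standard definition)."""
    return 100.0 * statistics.stdev(repeats) / statistics.mean(repeats)

# Hypothetical repeated IAAT volumes (litres) from one observer.
cv = coefficient_of_variation([4.70, 4.76, 4.73])
assert cv < 2.0   # of the same order as the ~1.6% intra-observer CV reported
```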


Collective behaviours can be observed in both natural and man-made systems composed of a large number of elemental subsystems. Typically, each elemental subsystem has its own dynamics but, whenever individuals interact, their individual behaviours tend to relax and collective behaviours emerge. In this paper, the collective behaviour of a large-scale system composed of several coupled elemental particles is analysed. The dynamics of the particles are governed by the same type of equations but with different parameter values and initial conditions. Coupling between particles is based on statistical feedback, meaning that each particle is affected by the average behaviour of its neighbours. It is shown that the global system may exhibit several types of collective behaviour: partial synchronisation, characterised by the existence of several clusters of synchronised subsystems, and global synchronisation, where all the elemental particles synchronise completely.
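A minimal sketch of this kind of statistical-feedback coupling is a population of phase oscillators, each with its own natural frequency, pulled toward the population average. This is a Kuramoto-style stand-in for illustration only; the paper's actual particle dynamics are not given in the abstract.

```python
import math
import random

def simulate(n=50, k=1.5, steps=2000, dt=0.01, seed=0):
    """Mean-field-coupled phase oscillators: each particle has its own
    dynamics (natural frequency) but is relaxed toward the average
    behaviour of the population. Returns the final order parameter r,
    which is near 1 under global synchronisation and small otherwise."""
    rng = random.Random(seed)
    omega = [rng.gauss(1.0, 0.1) for _ in range(n)]       # individual dynamics
    theta = [rng.uniform(0, 2 * math.pi) for _ in range(n)]
    for _ in range(steps):
        rx = sum(math.cos(t) for t in theta) / n          # population average
        ry = sum(math.sin(t) for t in theta) / n
        r, psi = math.hypot(rx, ry), math.atan2(ry, rx)
        theta = [t + dt * (w + k * r * math.sin(psi - t)) # statistical feedback
                 for t, w in zip(theta, omega)]
    return math.hypot(sum(math.cos(t) for t in theta) / n,
                      sum(math.sin(t) for t in theta) / n)

# Strong coupling drives the population toward global synchronisation,
# while without coupling the individual behaviours stay incoherent.
assert simulate(k=1.5) > simulate(k=0.0)
```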


This paper describes the development and testing of a robotic capsule for search and rescue operations at sea. The capsule can operate autonomously or under remote control; it is transported and deployed by a larger unmanned surface vehicle (USV) into a designated disaster area, where it carries a life raft and inflates it close to survivors in large-scale maritime disasters. The ultimate goal of this development is to provide search and rescue teams with tools that extend their operational capability in scenarios with adverse atmospheric or maritime conditions.


In recent years, the exponential growth in the use of mobile devices and of services made available in the cloud has changed the way systems are designed and implemented, in an attempt to meet requirements that until then had not been essential. With the enormous increase in mobile devices such as smartphones and tablets, the design and implementation of distributed systems became even more important in this area, in an effort to promote systems and applications that are more flexible, robust, scalable and, above all, interoperable. The limited processing and storage capacity of these devices made the emergence and growth of technologies that promise to solve many of the identified problems essential. The concept of middleware aims to fill these gaps in more evolved distributed systems, providing a solution at the level of system architecture organisation and design while offering extremely fast, secure and reliable communications. A middleware-based architecture endows systems with a communication channel that provides strong interoperability, scalability and security in message exchange, among other advantages. In this thesis, several types and examples of distributed systems are described and analysed, together with a detailed description of three communication protocols (XMPP, AMQP and DDS), two of which (XMPP and AMQP) are used in real projects described throughout the thesis. The main objective of this thesis is to present a study and a survey of the state of the art of the middleware concept applied to large-scale distributed systems, showing that the use of middleware can ease and speed up the design and development of a distributed system and offers enormous advantages for the near future.
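The decoupling that a middleware communication channel provides can be illustrated with a minimal in-process, topic-based publish/subscribe bus. This is a toy model of the idea only — it implements none of XMPP, AMQP or DDS, and all names are hypothetical.

```python
from collections import defaultdict

class MessageBus:
    """Minimal in-process topic-based publish/subscribe bus, illustrating
    how a middleware layer decouples producers from consumers (a toy
    model, not an implementation of XMPP, AMQP or DDS)."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        """Register a callback to receive every message on `topic`."""
        self._subscribers[topic].append(handler)

    def publish(self, topic, message):
        """Deliver `message` to all current subscribers of `topic`;
        the publisher never needs to know who they are."""
        for handler in self._subscribers[topic]:
            handler(message)

bus = MessageBus()
received = []
bus.subscribe("sensors/temperature", received.append)
bus.publish("sensors/temperature", {"value": 21.5})
bus.publish("sensors/humidity", {"value": 60})   # no subscriber: silently dropped
assert received == [{"value": 21.5}]
```

Real middleware adds the qualities the thesis discusses — network transport, security, delivery guarantees — on top of exactly this kind of interface.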


As technology advances, not only do new standards and programming styles appear, but some previously established ones also gain relevance. In a new Internet paradigm, where interconnection between small devices is key to the development of new businesses and to scientific advancement, there is a need for simple solutions that anyone can implement, so that ideas can become more than just ideas. Open-source software is alive and well, especially in the area of the Internet of Things. This opens windows for many low-capital entrepreneurs to experiment with their ideas and actually develop prototypes, which can help identify problems with a project or shed light on possible new features and interactions. As programming becomes more and more popular among people from fields unrelated to software, there is a need for guidance in developing something beyond basic algorithms, which is where this thesis comes in: a comprehensive document explaining the challenges of, and the available choices for, developing a sensor data and message delivery system that scales well and implements the delivery of critical messages. Modularity and extensibility were also given great importance, making this an affordable tool for anyone who wants to build a sensor network of this kind.
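The "delivery of critical messages" idea can be sketched as a priority queue in which critical messages are always dispatched before routine sensor readings. This is a hypothetical illustration of the concept, not the thesis's implementation; the class and priority names are assumptions.

```python
import heapq
import itertools

class DeliveryQueue:
    """Sensor-message queue in which critical messages are delivered
    before normal ones, with FIFO order preserved among messages of
    equal priority (an illustrative sketch of critical-message delivery)."""

    CRITICAL, NORMAL = 0, 1   # lower number = higher priority

    def __init__(self):
        self._heap = []
        self._counter = itertools.count()   # tie-breaker keeps FIFO order

    def push(self, message, priority=NORMAL):
        heapq.heappush(self._heap, (priority, next(self._counter), message))

    def pop(self):
        """Return the next message to deliver: critical first, then FIFO."""
        return heapq.heappop(self._heap)[2]

q = DeliveryQueue()
q.push("reading: 21C")
q.push("ALARM: smoke detected", priority=DeliveryQueue.CRITICAL)
q.push("reading: 22C")
assert q.pop() == "ALARM: smoke detected"   # critical message jumps the queue
assert q.pop() == "reading: 21C"
```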


Large-scale image mosaicing methods are in great demand among scientists who study different aspects of the seabed, and have been fostered by impressive advances in the capabilities of underwater robots to gather optical data from the seafloor. Cost and weight constraints mean that low-cost remotely operated vehicles (ROVs) usually have a very limited number of sensors. When a low-cost robot carries out a seafloor survey using a down-looking camera, it usually follows a predetermined trajectory that provides several non-time-consecutive overlapping image pairs. Finding these pairs (a process known as topology estimation) is indispensable for obtaining globally consistent mosaics and accurate trajectory estimates, which are necessary for a global view of the surveyed area, especially when optical sensors are the only data source. This thesis presents a set of consistent methods aimed at creating large-area image mosaics from optical data obtained during surveys with low-cost underwater vehicles. First, a global alignment method developed within a feature-based image mosaicing (FIM) framework, in which nonlinear minimisation is replaced by two linear steps, is discussed. Then, a simple four-point mosaic rectifying method is proposed to reduce distortions that might occur due to lens distortion, error accumulation and the difficulties of optical imaging in an underwater medium. The topology estimation problem is addressed by means of a combined augmented-state and extended Kalman filter framework, aimed at minimising the total number of matching attempts while simultaneously obtaining the best possible trajectory. Potential image pairs are predicted by taking into account the uncertainty in the trajectory. The contribution of matching an image pair is investigated using information theory principles. Lastly, a different solution to the topology estimation problem is proposed in a bundle adjustment framework.
Innovative aspects include the use of a fast image similarity criterion combined with a minimum spanning tree (MST) solution to obtain a tentative topology. This topology is improved by attempting image matching with the pairs for which there is the most overlap evidence. Unlike previous approaches to large-area mosaicing, our framework is able to deal naturally with cases where time-consecutive images cannot be matched successfully, such as completely unordered sets. Finally, the efficiency of the proposed methods is discussed and a comparison is made with other state-of-the-art approaches, using a series of challenging datasets in underwater scenarios.
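The MST-based tentative topology step can be sketched as a spanning tree over an image-similarity graph: greedily keeping the most similar image pairs (equivalently, an MST on 1 − similarity weights) connects all images, even when time-consecutive pairs match poorly. The similarity scores below are hypothetical; the thesis's fast similarity criterion is not detailed in the abstract.

```python
def tentative_topology(similarity):
    """Build a tentative mosaic topology as a spanning tree over image
    similarity, using Kruskal's algorithm with the best-scoring pairs
    first. `similarity` maps image-index pairs to a fast similarity score."""
    edges = sorted(similarity.items(), key=lambda kv: -kv[1])  # best pairs first
    parent = {}

    def find(x):                             # union-find with path halving
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    tree = []
    for (a, b), score in edges:
        ra, rb = find(a), find(b)
        if ra != rb:                         # edge joins two components: keep it
            parent[ra] = rb
            tree.append((a, b))
    return tree

# Four images; the time-consecutive pair (1, 2) is a poor match, but the
# tree still connects everything through higher-similarity pairs.
sim = {(0, 1): 0.9, (1, 2): 0.1, (2, 3): 0.8, (0, 2): 0.7, (1, 3): 0.2}
tree = tentative_topology(sim)
assert len(tree) == 3 and (1, 2) not in tree
```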


Reanalysis data obtained from data assimilation are increasingly used for diagnostic studies of the general circulation of the atmosphere, for the validation of modelling experiments and for estimating energy and water fluxes between the Earth's surface and the atmosphere. Because fluxes are not directly observed but determined by the data assimilation system, they are influenced not only by the utilised observations but also by model physics and dynamics and by the assimilation method. In order to better understand the relative importance of humidity observations for the determination of the hydrological cycle, in this paper we describe an assimilation experiment using the ERA40 reanalysis system in which all humidity data have been excluded from the observational database. The surprising result is that the model, driven by the time evolution of wind, temperature and surface pressure, is able to almost completely reconstitute the large-scale hydrological cycle of the control assimilation without the use of any humidity data. In addition, analysis of the individual weather systems in the extratropics and tropics using an objective feature tracking analysis indicates that the humidity data have very little impact on these systems. We include a discussion of these results and their possible consequences for the way moisture information is assimilated, as well as the potential consequences for the design of observing systems for climate monitoring. It is further suggested, with support from a simple assimilation study with another model, that model physics and dynamics play a decisive role in the hydrological cycle, stressing the need to better understand these aspects of model parametrization.


The field site network (FSN) plays a central role in conducting joint research within all Assessing Large-scale Risks for biodiversity with tested Methods (ALARM) modules and provides a mechanism for integrating research on different topics in ALARM at the same site, allowing multiple impacts on biodiversity to be measured. The network covers most European climates and biogeographic regions, from the Mediterranean through central European and boreal to subarctic. The project links databases with the Europe-wide FSN, including geographic information system (GIS)-based information that characterises the test locations for ALARM researchers conducting joint on-site research. Maps are provided in a standardised way and merged with other site-specific information. The application of GIS to these field sites and the associated information management promote the use of the FSN for research and for disseminating the results. We conclude that the ALARM FSN sites, together with other research sites in Europe, could jointly be used as a future backbone for research proposals.


The occurrence of mid-latitude windstorms is associated with strong socio-economic impacts. For detailed and reliable regional impact studies, large datasets of high-resolution wind fields are required. In this study, a statistical downscaling approach combined with dynamical downscaling is introduced to derive storm-related gust speeds on a high-resolution grid over Europe. Multiple linear regression models are trained using reanalysis data and wind gusts from regional climate model simulations for a sample of the 100 top-ranking windstorm events. The method is computationally inexpensive and reproduces individual windstorm footprints adequately. Compared to observations, the results for Germany are at least as good as those of pure dynamical downscaling. This new tool can easily be applied to large ensembles of general circulation model simulations and can thus contribute to a better understanding of the regional impact of windstorms in decadal and climate change projections.
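The statistical core of the approach — a multiple linear regression per grid point, mapping large-scale predictors to local gust speed — can be sketched as follows. The predictor choice and synthetic training data are illustrative assumptions, not the study's actual setup.

```python
import numpy as np

def fit_gust_model(predictors, gusts):
    """Fit one grid point's multiple linear regression: local gust speed
    as a linear function of large-scale predictors plus an intercept
    (an illustrative stand-in for the paper's trained models)."""
    X = np.column_stack([np.ones(len(predictors)), predictors])
    coeffs, *_ = np.linalg.lstsq(X, gusts, rcond=None)
    return coeffs

def predict_gust(coeffs, predictor_row):
    """Apply the fitted model to one event's large-scale predictors."""
    return float(coeffs[0] + np.dot(coeffs[1:], predictor_row))

# Synthetic training events: gust = 5 + 1.3*wind850 + 0.4*pressure_gradient
rng = np.random.default_rng(1)
X = rng.uniform(0, 20, size=(100, 2))
y = 5 + 1.3 * X[:, 0] + 0.4 * X[:, 1]
coeffs = fit_gust_model(X, y)
assert abs(predict_gust(coeffs, [10.0, 5.0]) - 20.0) < 1e-6   # 5 + 13 + 2
```

Because evaluating such a regression is trivial per event, the model can be swept over large ensembles of circulation-model output at negligible cost.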


The anomalously wet winter of 2010 had a very important impact on the Portuguese hydrological system. Given the detrimental effects that reduced precipitation in Portugal has on environmental and socio-economic systems, the 2010 winter was predominantly beneficial, reversing the precipitation deficits accumulated during the previous hydrological years. The recorded anomalously high precipitation amounts contributed to an overall increase in river runoff and dam recharge in the 4 major river basins. In synoptic terms, the 2010 winter was characterised by an anomalously strong westerly flow component over the North Atlantic that triggered high precipitation amounts. A dynamically coherent increase in the frequency of mid-latitude cyclones close to Portugal, accompanied by significant increases in the occurrence of cyclonic, southerly and south-westerly circulation weather types, is noteworthy. Furthermore, the prevalence of a strong negative phase of the North Atlantic Oscillation (NAO) also emphasises the main dynamical features of the 2010 winter. A comparison of the hydrological and atmospheric conditions between the 2010 winter and the 2 previous anomalously wet winters (1996 and 2001) was also carried out to isolate not only their similarities but also their contrasting conditions, highlighting the limitations of estimating winter precipitation amounts in Portugal using the NAO phase as the sole predictor.