970 results for synchrotron-based techniques


Relevance: 100.00%

Abstract:

Interest in developing applications for autonomous underwater vehicles (AUVs) has grown considerably in recent years. AUVs are attractive because of their size and because they do not need a human operator to pilot them. Even so, it is impossible to compare, in terms of efficiency and flexibility, the ability of a human pilot with the limited operational capabilities offered by current AUVs. Using AUVs to cover large areas involves solving complex problems, especially if the robot is required to react in real time to sudden changes in working conditions. For these reasons, the development of autonomous control systems to improve these capabilities has become a priority. This thesis addresses the problem of decision making with AUVs. The work presented focuses on the study, design and application of behaviours for AUVs using reinforcement learning (RL) techniques. The main contribution of this thesis is the application of several RL techniques to improve the autonomy of underwater robots, with the final goal of demonstrating the feasibility of these algorithms for learning autonomous underwater tasks in real time. In RL, the robot tries to maximise a scalar reward obtained as a consequence of its interaction with the environment. The goal is to find an optimal policy that maps every possible state to the action to execute in that state so as to maximise the total sum of rewards. This thesis therefore investigates mainly two families of RL algorithms: value-function (VF) methods and policy-gradient (PG) methods. The final experimental results show the underwater robot Ictineu performing a real autonomous underwater cable tracking task. To carry it out, an algorithm called the Actor-Critic (AC) method was designed, resulting from the fusion of VF methods with PG techniques.
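The thesis's AC method fuses value-function and policy-gradient ideas. As a minimal illustration of that fusion (a toy tabular actor-critic on a hypothetical chain task, not the Ictineu controller), the critic learns state values while the actor shifts action preferences by the TD error:

```python
import numpy as np

def actor_critic(n_states=5, n_actions=2, episodes=200,
                 alpha=0.1, beta=0.05, gamma=0.9, seed=0):
    """Tabular actor-critic on a toy chain: action 1 advances toward a
    rewarded goal state, action 0 stays put."""
    rng = np.random.default_rng(seed)
    V = np.zeros(n_states)                   # critic: state values (VF part)
    prefs = np.zeros((n_states, n_actions))  # actor: action preferences (PG part)
    for _ in range(episodes):
        s = 0
        while s < n_states - 1:
            # softmax policy from the actor's preferences
            p = np.exp(prefs[s] - prefs[s].max())
            p /= p.sum()
            a = rng.choice(n_actions, p=p)
            s2 = s + 1 if a == 1 else s
            r = 1.0 if s2 == n_states - 1 else 0.0
            # TD error drives both updates (zero value at the terminal state)
            td = r + gamma * V[s2] * (s2 < n_states - 1) - V[s]
            V[s] += alpha * td        # critic update
            prefs[s, a] += beta * td  # actor update
            s = s2
    return V, prefs
```

After training, the actor prefers the advancing action in every state, which is the behaviour-learning loop the thesis scales up to real underwater tasks.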

Relevance: 100.00%

Abstract:

Electrical load forecasting plays a vital role in realising next-generation power system concepts such as the smart grid, efficient energy management and better power system planning. As a result, high forecast accuracy is required over the multiple time horizons associated with regulation, dispatching, scheduling and unit commitment of the power grid. Artificial Intelligence (AI) based techniques are being developed and deployed worldwide in a variety of applications because of their superior capability to handle complex input-output relationships. This paper provides a comprehensive and systematic literature review of AI-based short-term load forecasting techniques. The major objective of this study is to review, identify, evaluate and analyse the performance of AI-based load forecast models and to identify research gaps. The accuracy of ANN-based forecast models is found to depend on a number of parameters, such as the forecast model architecture, input combination, activation functions and training algorithm of the network, as well as other exogenous variables affecting the model inputs. The published literature reviewed in this paper shows the potential of AI techniques for effective load forecasting in support of smart grids and buildings.
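As a toy illustration of the kind of ANN-based forecast model the review surveys (the architecture, input encoding and synthetic load curve here are all assumptions, not any reviewed model), a one-hidden-layer network can be fit to a daily load profile from hour-of-day features:

```python
import numpy as np

def train_load_forecaster(hidden=8, epochs=2000, lr=0.05, seed=0):
    """Tiny one-hidden-layer ANN fit by full-batch gradient descent to a
    synthetic daily load curve; inputs are sin/cos encodings of the hour."""
    rng = np.random.default_rng(seed)
    hours = np.arange(24)
    X = np.column_stack([np.sin(2 * np.pi * hours / 24),
                         np.cos(2 * np.pi * hours / 24)])
    y = 0.6 + 0.3 * np.sin(2 * np.pi * (hours - 7) / 24)  # synthetic load
    W1 = rng.normal(0, 0.5, (2, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, (hidden, 1)); b2 = np.zeros(1)
    losses = []
    for _ in range(epochs):
        h = np.tanh(X @ W1 + b1)
        pred = (h @ W2 + b2).ravel()
        err = pred - y
        losses.append(float(np.mean(err ** 2)))
        # backpropagation of the mean-squared-error gradient
        gpred = 2 * err[:, None] / len(y)
        gW2 = h.T @ gpred; gb2 = gpred.sum(0)
        gh = gpred @ W2.T * (1 - h ** 2)
        gW1 = X.T @ gh; gb1 = gh.sum(0)
        W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2
    return losses
```

The review's point about sensitivity to architecture and inputs can be reproduced by varying `hidden`, the input encoding, or the activation function and watching the training loss.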

Relevance: 100.00%

Abstract:

Image-based Relighting (IBRL) has recently attracted considerable research interest for its ability to relight real objects or scenes under novel illuminations captured in natural or synthetic environments. Complex lighting effects such as subsurface scattering, interreflection, shadowing, mesostructural self-occlusion and refraction can be generated with IBRL. The main advantage of image-based graphics is that rendering time is independent of scene complexity, as rendering is a process of manipulating image pixels rather than simulating light transport. The goal of this paper is to provide a complete and systematic overview of research in Image-based Relighting. We observe that essentially all IBRL techniques can be broadly classified into three categories (Fig. 9), based on how the scene/illumination information is captured: reflectance function-based, basis function-based and plenoptic function-based. We discuss the characteristics of each category and its representative methods, as well as sampling density and the types of light source(s), both relevant issues in IBRL.
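The basis function-based category rests on the linearity of light transport: an image under a novel illumination is a weighted sum of images captured under basis lights. A minimal sketch of that superposition step (array shapes are assumptions):

```python
import numpy as np

def relight(basis_images, weights):
    """Image-based relighting by superposition: combine images captured
    under basis lights with the novel illumination's weights."""
    basis = np.asarray(basis_images, dtype=float)  # (n_lights, H, W)
    w = np.asarray(weights, dtype=float)           # (n_lights,)
    return np.tensordot(w, basis, axes=1)          # relit image, (H, W)
```

Because the operation is pure pixel arithmetic, its cost depends only on image resolution and the number of basis lights, never on scene complexity, which is exactly the advantage the abstract highlights.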

Relevance: 100.00%

Abstract:

Given the considerable recent attention to distributed power generation and interest in sustainable energy, the integration of photovoltaic (PV) systems into grid-connected or isolated microgrids has become widespread. To maximise the power output of a PV system, extensive research into control strategies for maximum power point tracking (MPPT) has been conducted. Owing to their robust, reliable and fast performance, artificial intelligence-based MPPT methods have recently been applied to various systems under different conditions. Given the diversity of recent advances in MPPT approaches, a review focusing on the performance and reliability of these methods under diverse conditions is required. This paper reviews AI-based MPPT techniques that have proven effective, feasible to implement and common in the literature, including their limitations and advantages. To support researchers in applying the reviewed techniques, this study is not limited to reviewing the performance of recently adopted methods; it also discusses the background theory, application to MPPT systems, and the important references relating to each method. It is envisioned that this review can be a valuable resource for researchers and engineers working with PV-based power systems, enabling them to access the basic theory behind each method, select the appropriate method according to project requirements, and implement MPPT systems that fulfil project objectives.
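AI-based MPPT methods are commonly benchmarked against the classic perturb-and-observe (P&O) hill climber. As background, a minimal P&O sketch (the `power_at` measurement callback and the fixed step size are assumptions for illustration):

```python
def p_and_o_mppt(power_at, v0=10.0, step=0.5, iters=100):
    """Classic perturb-and-observe MPPT: perturb the operating voltage,
    keep the perturbation direction if power increased, reverse otherwise."""
    v, direction = v0, 1.0
    p_prev = power_at(v)
    for _ in range(iters):
        v += direction * step
        p = power_at(v)
        if p < p_prev:
            direction = -direction  # overshot the peak, turn back
        p_prev = p
    return v
```

The fixed step makes P&O oscillate around the maximum power point and lag under fast irradiance changes, which are precisely the weaknesses that motivate the AI-based methods the paper reviews.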

Relevance: 90.00%

Abstract:

Automatic recognition of people is an active field of research with important forensic and security applications. In these applications, it is not always possible for the subject to be in close proximity to the system. Voice represents a human behavioural trait which can be used to recognise people in such situations. Automatic Speaker Verification (ASV) is the process of verifying a person's identity through the analysis of their speech, and enables recognition of a subject at a distance over a telephone channel, wired or wireless. A significant amount of research has focussed on the application of Gaussian mixture model (GMM) techniques to speaker verification systems, providing state-of-the-art performance. GMMs are a type of generative classifier trained to model the probability distribution of the features used to represent a speaker. Recently introduced to the field of ASV research is the support vector machine (SVM). An SVM is a discriminative classifier requiring examples from both positive and negative classes to train a speaker model. The SVM is based on margin maximisation, whereby a hyperplane attempts to separate classes in a high-dimensional space. SVMs applied to the task of speaker verification have shown high potential, particularly when used to complement current GMM-based techniques in hybrid systems. This work aims to improve the performance of ASV systems using novel and innovative SVM-based techniques. Research was divided into three main themes: session variability compensation for SVMs; unsupervised model adaptation; and impostor dataset selection. The first theme investigated the differences between the GMM and SVM domains for the modelling of session variability, an aspect crucial for robust speaker verification. Techniques developed to improve the robustness of GMM-based classification were shown to bring about similar benefits to discriminative SVM classification through their integration in the hybrid GMM mean supervector SVM classifier.
Further, the domains for the modelling of session variation were contrasted to find a number of common factors; however, the SVM domain consistently provided marginally better session variation compensation. Minimal complementary information was found between the techniques due to the similarities in how they achieved their objectives. The second theme saw the proposal of a novel model for the purpose of session variation compensation in ASV systems. Continuous progressive model adaptation attempts to improve speaker models by retraining them after exploiting all encountered test utterances during normal use of the system. The introduction of the weight-based factor analysis model provided significant performance improvements of over 60% in an unsupervised scenario. SVM-based classification was then integrated into the progressive system, providing further benefits in performance over the GMM counterpart. Analysis demonstrated that SVMs also hold several characteristics beneficial to the task of unsupervised model adaptation, prompting further research in the area. In pursuing the final theme, an innovative background dataset selection technique was developed. This technique selects the most appropriate subset of examples from a large and diverse set of candidate impostor observations for use as the SVM background, by exploiting the SVM training process. This selection was performed on a per-observation basis so as to overcome the shortcomings of the traditional heuristic-based approach to dataset selection. Results demonstrate that the approach provides performance improvements over both the use of the complete candidate dataset and the best heuristically selected dataset, whilst being only a fraction of the size. The refined dataset was also shown to generalise well to unseen corpora and to be highly applicable to the selection of the impostor cohorts required in alternative techniques for speaker verification.
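The GMM mean supervector at the heart of the hybrid classifier is built by adapting a universal background model's means toward a speaker's features and concatenating them. A simplified sketch (real systems use posterior-weighted MAP adaptation over soft assignments; the hard assignments and toy "UBM" here are assumptions for brevity):

```python
import numpy as np

def map_adapt_supervector(ubm_means, features, relevance=16.0):
    """Adapt each mixture mean toward the speaker's assigned frames and
    stack the adapted means into a single supervector for an SVM."""
    K, d = ubm_means.shape
    # hard-assign each feature frame to its nearest mixture component
    dists = ((features[:, None, :] - ubm_means[None]) ** 2).sum(-1)
    assign = dists.argmin(1)
    adapted = ubm_means.copy()
    for k in range(K):
        fk = features[assign == k]
        n_k = len(fk)
        if n_k:
            # relevance-factor MAP interpolation between data and prior mean
            alpha = n_k / (n_k + relevance)
            adapted[k] = alpha * fk.mean(0) + (1 - alpha) * ubm_means[k]
    return adapted.reshape(-1)  # (K*d,) supervector
```

Each utterance then becomes one fixed-length vector, which is what lets a discriminative SVM operate on variable-length speech.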

Relevance: 90.00%

Abstract:

Eigen-based techniques and other monolithic approaches have long been a cornerstone of the face recognition community due to the high dimensionality of face images. Eigen-face techniques provide minimal reconstruction error and limit high-frequency content, while linear discriminant-based techniques (fisher-faces) allow the construction of subspaces which preserve discriminatory information. This paper presents a frequency decomposition approach for improved face recognition performance utilising three well-known techniques: wavelets; Gabor / log-Gabor filters; and the discrete cosine transform. Experimentation illustrates that frequency-domain partitioning prior to dimensionality reduction increases the information available for classification and greatly improves face recognition performance for both eigen-face and fisher-face approaches.
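As an illustration of the general pipeline (frequency decomposition first, subspace projection second), here is a sketch of the DCT branch; the image sizes, component count and lack of any classifier are assumptions, not the paper's configuration:

```python
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II transform matrix."""
    k = np.arange(n)[:, None]
    i = np.arange(n)[None, :]
    C = np.cos(np.pi * (2 * i + 1) * k / (2 * n)) * np.sqrt(2.0 / n)
    C[0] /= np.sqrt(2.0)
    return C

def eigenfaces_dct(images, n_components=2):
    """Frequency-domain eigenfaces: 2-D DCT of each face, then PCA
    (via SVD) on the flattened coefficients."""
    imgs = np.asarray(images, dtype=float)  # (n, H, W)
    n, H, W = imgs.shape
    Ch, Cw = dct_matrix(H), dct_matrix(W)
    coeffs = np.stack([Ch @ im @ Cw.T for im in imgs]).reshape(n, -1)
    mean = coeffs.mean(0)
    U, S, Vt = np.linalg.svd(coeffs - mean, full_matrices=False)
    basis = Vt[:n_components]            # "eigenfaces" in the DCT domain
    projections = (coeffs - mean) @ basis.T  # low-dim features
    return basis, projections
```

Partitioning the DCT coefficients into frequency bands before the SVD step (omitted here) is what the paper reports as the main source of improvement.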

Relevance: 90.00%

Abstract:

Learning and then recognizing a route, whether travelled during the day or at night, in clear or inclement weather, and in summer or winter, is a challenging task for state-of-the-art algorithms in computer vision and robotics. In this paper, we present a new approach to visual navigation under changing conditions, dubbed SeqSLAM. Instead of calculating the single location most likely given the current image, our approach calculates the best candidate matching location within every local navigation sequence. Localization is then achieved by recognizing coherent sequences of these "local best matches". This approach removes the need for global matching performance by the vision front-end; instead it must only pick the best match within any short sequence of images. The approach is applicable over environment changes that render traditional feature-based techniques ineffective. Using two car-mounted camera datasets, we demonstrate the effectiveness of the algorithm and compare it to one of the most successful feature-based SLAM algorithms, FAB-MAP. The perceptual change in the datasets is extreme: repeated traverses through environments during the day and then in the middle of the night, at times separated by months or years and in opposite seasons, and in clear weather and extremely heavy rain. While the feature-based method fails, the sequence-based algorithm is able to match trajectory segments at 100% precision with recall rates of up to 60%.
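The core idea, scoring whole local sequences rather than single frames, can be sketched as a search over a frame-difference matrix. This simplified version assumes constant travel speed and omits SeqSLAM's local contrast enhancement and velocity search:

```python
import numpy as np

def seqslam_match(ref_desc, query_desc, seq_len=5):
    """Score each candidate alignment by the summed image difference along
    a whole local sequence; return the reference start index with the
    lowest total difference."""
    # pairwise frame differences: (n_ref, n_query)
    D = np.abs(ref_desc[:, None, :] - query_desc[None, :, :]).sum(-1)
    n_ref, n_q = D.shape
    best_score, best_start = np.inf, -1
    for start in range(n_ref - seq_len + 1):
        # straight-line trajectory: query frame j matches ref frame start+j
        score = sum(D[start + j, j] for j in range(min(seq_len, n_q)))
        if score < best_score:
            best_score, best_start = score, start
    return best_start, best_score
```

Even when every individual frame difference is noisy, the summed sequence score at the true alignment stands out, which is why the method survives perceptual change that defeats single-image matching.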

Relevance: 90.00%

Abstract:

With the explosive growth of resources available through the Internet, information mismatching and overload have become a severe concern to users. Web users are commonly overwhelmed by a huge volume of information and are faced with the challenge of finding the most relevant and reliable information in a timely manner. Personalised information gathering and recommender systems represent state-of-the-art tools for efficient selection of the most relevant and reliable information resources, and interest in such systems has increased dramatically over the last few years. However, web personalisation has not yet been well exploited; difficulties arise in selecting resources through recommender systems from both technological and social perspectives. Aiming to promote high-quality research to overcome these challenges, this paper provides a comprehensive survey of recent work and achievements in the areas of personalised web information gathering and recommender systems. The report covers the concept-based techniques exploited in personalised information gathering and recommender systems.

Relevance: 90.00%

Abstract:

Currently, recommender systems (RS) are widely applied in commercial e-commerce sites to help users deal with the information overload problem. Recommender systems provide personalized recommendations to users and thus help in making good decisions about which product to buy from the vast number of product choices. Many current recommender systems are developed for simple and frequently purchased products such as books and videos, using collaborative-filtering and content-based approaches. These approaches are not directly applicable for recommending infrequently purchased products such as cars and houses, as it is difficult to collect a large number of user ratings for such products. Many e-commerce sites for infrequently purchased products still use basic search-based techniques, whereby products that match the attributes given in the target user's query are retrieved and recommended. However, search-based recommenders cannot provide personalized recommendations: for different users, the recommendations will be the same if they provide the same query, regardless of any difference in their interests. In this article, a simple user profiling approach is proposed to generate users' preferences for product attributes (i.e., user profiles) based on user product click-stream data. The user profiles can be used to find similar-minded users (i.e., neighbours) accurately. Two recommendation approaches are proposed, namely the Round-Robin fusion algorithm (CFRRobin) and the Collaborative Filtering-based Aggregated Query algorithm (CFAgQuery), to generate personalized recommendations based on the user profiles.
Instead of using the target user's query to search for products, as normal search-based systems do, the CFRRobin technique uses the attributes of the products in which the target user's neighbours have shown interest as queries to retrieve relevant products, and then recommends to the target user a list of products obtained by merging and ranking the returned products using the round-robin method. The CFAgQuery technique uses the attributes of the products in which the user's neighbours have shown interest to derive an aggregated query, which is then used to retrieve products to recommend to the target user. Experiments conducted on a real e-commerce dataset show that both proposed techniques, CFRRobin and CFAgQuery, perform better than the standard Collaborative Filtering and Basic Search approaches that are widely applied in current e-commerce applications.
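The round-robin merging step used by CFRRobin can be sketched as follows: take the highest-ranked unseen product from each neighbour-derived result list in turn. This is a generic round-robin fusion, not the authors' exact implementation:

```python
def round_robin_fuse(ranked_lists, top_n=5):
    """Merge several ranked result lists round-robin style, skipping
    duplicates, until top_n products have been collected."""
    fused, seen = [], set()
    longest = max((len(lst) for lst in ranked_lists), default=0)
    for rank in range(longest):          # sweep rank positions 0, 1, 2, ...
        for lst in ranked_lists:         # visit each neighbour's list in turn
            if rank < len(lst) and lst[rank] not in seen:
                seen.add(lst[rank])
                fused.append(lst[rank])
                if len(fused) == top_n:
                    return fused
    return fused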

Relevance: 90.00%

Abstract:

The coupling of kurtosis-based indexes and envelope analysis represents one of the most successful and widespread procedures for the diagnosis of incipient faults in rolling element bearings. Kurtosis-based indexes are often used to select the proper demodulation band for the application of envelope-based techniques. Kurtosis itself, in slightly different formulations, is applied to the prognostics and condition monitoring of rolling element bearings as a standalone tool for a fast indication of developing faults. This paper shows for the first time the strong analytical connection that holds between these two families of indexes. In particular, analytical identities are shown for the squared envelope spectrum (SES) and the kurtosis of the corresponding band-pass filtered analytic signal: it is demonstrated how the sum of the peaks in the SES corresponds to the raw 4th-order moment. The analytical results also show a link with another signal processing technique, cepstrum pre-whitening, recently used in bearing diagnostics. These results form the basis for the discussion of an optimal indicator for the choice of the demodulation band, the ratio of cyclic content (RCC), which endows the kurtosis with selectivity in the cyclic frequency domain and whose performance is compared with more traditional kurtosis-based indicators such as the protrugram. A benchmark, performed on numerical simulations and experimental data from two different test rigs, proves the superior effectiveness of this indicator. Finally, a short introduction to the potential offered by the newly proposed index in the field of prognostics is given in an additional experimental example: the RCC is tested on experimental data collected on an endurance bearing test rig, showing its ability to follow the development of the damage with a single numerical index.
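The two quantities the paper connects analytically can be sketched directly: the squared envelope spectrum via the standard FFT-based Hilbert transform, and the raw kurtosis as a normalised 4th-order moment. This is a generic illustration of the quantities, not the paper's derivation:

```python
import numpy as np

def squared_envelope_spectrum(x):
    """SES via the analytic signal: the FFT-based Hilbert transform gives
    the envelope, whose squared magnitude is Fourier-analysed."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)          # frequency-domain Hilbert weights
    h[0] = 1
    h[1:(n + 1) // 2] = 2
    if n % 2 == 0:
        h[n // 2] = 1
    analytic = np.fft.ifft(X * h)
    env_sq = np.abs(analytic) ** 2
    return np.abs(np.fft.rfft(env_sq - env_sq.mean()))  # DC removed

def kurtosis(x):
    """Raw kurtosis: normalised 4th-order moment of the signal."""
    x = np.asarray(x, float)
    x = x - x.mean()
    return np.mean(x ** 4) / np.mean(x ** 2) ** 2
```

On a bearing-like amplitude-modulated signal, the SES peaks at the modulation (fault) frequency, which is the cyclic content the paper's RCC indicator measures against the 4th-order moment.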

Relevance: 90.00%

Abstract:

Process compliance measurement is receiving increasing attention in companies due to stricter legal requirements and market pressure for operational excellence. To judge the compliance of business processing, the degree of behavioural deviation of a case, i.e., an observed execution sequence, is quantified with respect to a process model (referred to as fitness, or recall). Recently, different compliance measures have been proposed. Still, nearly all of them are grounded on state-based techniques, and the trace equivalence criterion in particular. As a consequence, these approaches have to deal with the state explosion problem. In this paper, we argue that a behavioural abstraction may be leveraged to measure the compliance of a process log, a collection of cases. To this end, we utilise causal behavioural profiles that capture the behavioural characteristics of process models and cases, and can be computed efficiently. We propose different compliance measures based on these profiles, discuss the impact of noise in process logs on our measures, and show how diagnostic information on non-compliance is derived. As a validation, we report on findings from applying our approach in a case study with an international service provider.
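The paper's causal behavioural profiles capture richer relations (order, exclusiveness, co-occurrence); purely as an illustration of the general profile idea, compliance can be approximated as the share of a case's ordering pairs that the model's behaviour allows, avoiding any state-space construction:

```python
def order_pairs(trace):
    """All ordered activity pairs (a occurs before b) in a trace."""
    return {(a, b) for i, a in enumerate(trace) for b in trace[i + 1:]}

def profile_compliance(model_traces, case):
    """Simplified profile-style compliance: fraction of the case's ordering
    pairs that also occur in some behaviour allowed by the model."""
    allowed = set()
    for t in model_traces:
        allowed |= order_pairs(t)
    pairs = order_pairs(case)
    if not pairs:
        return 1.0
    return len(pairs & allowed) / len(pairs)
```

The violating pairs (`pairs - allowed`) double as diagnostic information on where the case deviates, mirroring the diagnostics the paper derives from its profiles.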

Relevance: 90.00%

Abstract:

Theoretical approaches are of fundamental importance for predicting the potential impact of waste disposal facilities on groundwater contamination. Appropriate design parameters are generally estimated by fitting theoretical models to data gathered from field monitoring or laboratory experiments. Transient through-diffusion tests are generally conducted in the laboratory to estimate the mass transport parameters of the proposed barrier material. These parameters are usually estimated either by approximate eye-fitting calibration or by combining the solution of the direct problem with any available gradient-based technique. In this work, an automated, gradient-free solver is developed to estimate the mass transport parameters of a transient through-diffusion model. The proposed inverse model uses a particle swarm optimization (PSO) algorithm, which is based on the social behaviour of animals searching for food sources. The finite difference numerical solution of the forward model is integrated with the PSO algorithm to solve the inverse problem of parameter estimation. The working principle of the new solver is demonstrated and mass transport parameters are estimated from laboratory through-diffusion experimental data. An inverse model based on a standard gradient-based technique is formulated for comparison with the proposed solver. A detailed comparative study is carried out between the conventional methods and the proposed solver. The present automated technique is found to be very efficient and robust, and the mass transport parameters are obtained with great precision.
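A minimal PSO sketch of the kind of gradient-free optimiser described above; in the paper the objective `f` would be the misfit between the finite-difference forward model and the through-diffusion data, whereas here it is left generic (the inertia and attraction coefficients are common textbook values, an assumption):

```python
import numpy as np

def pso_minimise(f, bounds, n_particles=20, iters=100,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Particles move under inertia plus random attraction toward their
    personal best and the swarm's global best; no gradients needed."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    dim = len(lo)
    x = rng.uniform(lo, hi, (n_particles, dim))
    v = np.zeros((n_particles, dim))
    pbest, pbest_f = x.copy(), np.array([f(p) for p in x])
    g = pbest[pbest_f.argmin()].copy()   # global best position
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)       # keep particles inside the bounds
        fx = np.array([f(p) for p in x])
        improved = fx < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], fx[improved]
        g = pbest[pbest_f.argmin()].copy()
    return g, pbest_f.min()
```

Because only objective evaluations are required, the same loop works whether `f` wraps an analytical expression or a full finite-difference simulation of the diffusion test.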

Relevance: 90.00%

Abstract:

Megasphaera cerevisiae, Pectinatus cerevisiiphilus, Pectinatus frisingensis, Selenomonas lacticifex, Zymophilus paucivorans and Zymophilus raffinosivorans are strictly anaerobic Gram-stain-negative bacteria that are able to spoil beer by producing off-flavours and turbidity. They have only been isolated from the beer production chain. The species are phylogenetically affiliated to the Sporomusa sub-branch in the class "Clostridia". Routine cultivation methods for the detection of strictly anaerobic bacteria in breweries are time-consuming and do not allow species identification. The main aim of this study was to utilise DNA-based techniques to improve the detection and identification of the Sporomusa sub-branch beer-spoilage bacteria and to increase understanding of their biodiversity, evolution and natural sources. Practical PCR-based assays were developed for monitoring M. cerevisiae, Pectinatus species and the group of Sporomusa sub-branch beer spoilers throughout the beer production process. The developed assays reliably differentiated the target bacteria from other brewery-related microbes. Contaminant detection in process samples (10–1,000 cfu/ml) could be accomplished in 2–8 h. Low levels of viable cells in finished beer (≤10 cfu/100 ml) were usually detected after 1–3 d of culture enrichment. The time saving compared to cultivation methods was up to 6 d. Based on a polyphasic approach, this study revealed the existence of three new anaerobic spoilage species in the beer production chain, i.e. Megasphaera paucivorans, Megasphaera sueciensis and Pectinatus haikarae. The description of these species enabled the establishment of phenotypic and DNA-based methods for their detection and identification. The 16S rRNA gene-based phylogenetic analysis of the Sporomusa sub-branch showed that the genus Selenomonas originates from several ancestors and will require reclassification. Moreover, Z. paucivorans and Z. raffinosivorans were found to be in fact members of the genus Propionispira. This relationship implies that they were carried to breweries along with plant material. The brewery-related Megasphaera species formed a distinct sub-group that did not include any sequences from other sources, suggesting that M. cerevisiae, M. paucivorans and M. sueciensis may be uniquely adapted to the brewery ecosystem. M. cerevisiae was also shown to exhibit remarkable resistance to many brewery-related stress conditions, which may partly explain why it is a brewery contaminant. This study showed that DNA-based techniques provide useful tools for obtaining more rapid and specific information about the presence and identity of strictly anaerobic spoilage bacteria in the beer production chain than is possible using cultivation methods. This should ensure financial benefits to the industry and better product quality for customers. In addition, DNA-based analyses provided new insight into the biodiversity, natural sources and relations of the Sporomusa sub-branch bacteria. The data can be exploited for the taxonomic classification of these bacteria and for the surveillance and control of contaminations.

Relevance: 90.00%

Abstract:

Traditional image reconstruction methods in rapid dynamic diffuse optical tomography employ ℓ2-norm-based regularization, which is known to remove the high-frequency components in the reconstructed images and make them appear smooth. The contrast recovery in these types of methods typically depends on the iterative nature of the method employed, where nonlinear iterative techniques are known to perform better than linear (noniterative) techniques, with the caveat that nonlinear techniques are computationally complex. Assuming a linear dependency of the solution between successive frames results in a linear inverse problem. This new framework, combined with ℓ1-norm-based regularization, can provide better robustness to noise and better contrast recovery compared to conventional ℓ2-based techniques. Moreover, it is shown that the proposed ℓ1-based technique is computationally more efficient than its ℓ2-based counterpart. The proposed framework requires a reasonably close estimate of the actual solution for the initial frame, and any suboptimal estimate leads to erroneous reconstruction results for the subsequent frames.
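The paper's scheme is specific to diffuse optical tomography, but the ℓ1-regularised linear inverse problem it relies on is standard. As a generic illustration (not the authors' solver), a minimal ISTA sketch for min_x 0.5·||Ax − b||² + λ·||x||₁, a gradient step on the data term followed by soft-thresholding:

```python
import numpy as np

def ista(A, b, lam=0.1, iters=500):
    """Iterative shrinkage-thresholding for l1-regularised least squares;
    the soft threshold is the proximal operator of the l1 penalty and is
    what preserves sharp, sparse features that l2 smoothing removes."""
    L = np.linalg.norm(A, 2) ** 2   # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        g = A.T @ (A @ x - b)       # gradient of the data-fit term
        z = x - g / L
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
    return x
```

Each iteration costs only two matrix-vector products, which reflects the computational-efficiency argument made for the ℓ1-based approach.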

Relevance: 90.00%

Abstract:

These are definitively exciting times for membrane lipid researchers. Once considered merely the building blocks of cell membranes, the important role these lipids play is steadily being acknowledged. Improvements in mass spectrometry (MS) techniques allow the establishment of the precise lipid composition of biological extracts. However, to fully understand the biological function of each individual lipid species, we need to know its spatial distribution and dynamics. In the past 10 years, the field has experienced a profound revolution thanks to the development of MS-based techniques allowing lipid imaging (MSI). Images reveal and verify what many lipid researchers had already shown by different means, but none as convincing as an image: each cell type presents a specific lipid composition, which is highly sensitive to its physiological and pathological state. While these techniques will help to place membrane lipids in the position they deserve, they also open the black box containing all the unknown regulatory mechanisms that account for such a tailored lipid composition. These results thus urge different disciplines to redefine their paradigm of study to include the complexity revealed by the MSI techniques.