960 results for Map-based Cloning


Relevance:

30.00%

Publisher:

Abstract:

Geostatistics has been successfully used to analyze and characterize the spatial variability of environmental properties. Besides giving estimated values at unsampled locations, it provides a measure of the accuracy of the estimate, which is a significant advantage over traditional methods used to assess pollution. In this work, universal block kriging is used in a novel way to model and map the spatial distribution of salinity measurements gathered by an Autonomous Underwater Vehicle in a sea outfall monitoring campaign, with the aim of distinguishing the effluent plume from the receiving waters, characterizing its spatial variability in the vicinity of the discharge and estimating dilution. The results demonstrate that geostatistical methodology can provide good estimates of the dispersion of effluents that are very valuable in assessing the environmental impact and managing sea outfalls. Moreover, since accurate measurements of the plume’s dilution are rare, these studies might be very helpful in the future for validating dispersion models.
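As an illustration only (a sketch under assumed data and tooling, not the study's implementation), the snippet below performs universal kriging of synthetic salinity samples with PyKrige and then approximates block kriging by averaging the point estimates over blocks; the coordinates, salinity values, variogram model and block size are all invented.

```python
# Sketch: universal kriging of synthetic salinity samples, then block averaging
# of the point estimates as a simple approximation of block kriging.
import numpy as np
from pykrige.uk import UniversalKriging

# Hypothetical AUV salinity samples: x, y in metres, salinity in PSU
x = np.random.uniform(0, 500, 200)
y = np.random.uniform(0, 500, 200)
sal = (35.0
       - 2.0 * np.exp(-((x - 250) ** 2 + (y - 250) ** 2) / 5000.0)
       + np.random.normal(0, 0.05, 200))

uk = UniversalKriging(
    x, y, sal,
    variogram_model="spherical",
    drift_terms=["regional_linear"],  # linear trend as the universal-kriging drift
)

# Point estimates and kriging variance on a fine grid
gridx = np.arange(0, 500, 5.0)
gridy = np.arange(0, 500, 5.0)
z, ss = uk.execute("grid", gridx, gridy)

# Block estimates: average point estimates over 25 m x 25 m blocks (5 x 5 cells)
block = z.data.reshape(20, 5, 20, 5).mean(axis=(1, 3))
print(block.shape)  # (20, 20) block map of estimated salinity
```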

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a novel approach to WLAN propagation models for use in indoor localization. The major goal of this work is to eliminate the need for in situ data collection to generate the fingerprinting map; instead, the map is generated using analytical propagation models such as COST Multi-Wall, COST 231 average wall and Motley-Keenan. kNN (K-Nearest Neighbour) and WkNN (Weighted K-Nearest Neighbour) were used as location estimation algorithms to determine the accuracy of the proposed technique. This work relies on analytical and measurement tools to determine which path loss propagation models are better suited for location estimation applications based on the Received Signal Strength Indicator (RSSI). This study presents different proposals for choosing the most appropriate values for the model parameters, such as obstacle attenuation values and coefficients. Some adjustments to these models, particularly to Motley-Keenan to take wall thickness into account, are proposed. The best solution found is based on the adjusted Motley-Keenan and COST models, which allow the propagation loss to be estimated for several environments. Results obtained from two testing scenarios showed the reliability of the adjustments, yielding smaller errors between measured and predicted values.
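As a rough illustration of the multi-wall model family named above (a sketch with assumed reference loss, path loss exponent and wall attenuations, not the values adjusted in the paper), a Motley-Keenan-style RSSI prediction could look like this:

```python
import math

def motley_keenan_path_loss(d_m, pl0_db=40.0, n=2.0, walls=None):
    """Motley-Keenan-style multi-wall path loss (dB).

    d_m    : transmitter-receiver distance in metres
    pl0_db : reference loss at 1 m (assumed value)
    n      : path loss exponent (assumed value)
    walls  : list of (count, attenuation_db) for each wall type crossed
    """
    walls = walls or []
    wall_loss = sum(count * att for count, att in walls)
    return pl0_db + 10.0 * n * math.log10(d_m) + wall_loss

# Predicted RSSI for a 20 dBm transmitter 12 m away, through 2 brick walls (5 dB each)
tx_power_dbm = 20.0
rssi = tx_power_dbm - motley_keenan_path_loss(12.0, walls=[(2, 5.0)])
print(f"predicted RSSI = {rssi:.1f} dBm")
```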

Relevance:

30.00%

Publisher:

Abstract:

MSc Dissertation in Computer Engineering

Relevance:

30.00%

Publisher:

Abstract:

Dissertation submitted in partial fulfilment of the requirements for the Degree of Master of Science in Geospatial Technologies

Relevance:

30.00%

Publisher:

Abstract:

The Evidence Accumulation Clustering (EAC) paradigm is a clustering ensemble method which derives a consensus partition from a collection of base clusterings obtained using different algorithms. It collects from the partitions in the ensemble a set of pairwise observations about the co-occurrence of objects in the same cluster and uses these co-occurrence statistics to derive a similarity matrix, referred to as the co-association matrix. The Probabilistic Evidence Accumulation for Clustering Ensembles (PEACE) algorithm is a principled approach for the extraction of a consensus clustering from the observations encoded in the co-association matrix, based on a probabilistic model of the co-association matrix parameterized by the unknown assignments of objects to clusters. In this paper we extend the PEACE algorithm by deriving a consensus solution according to a MAP approach with Dirichlet priors defined for the unknown probabilistic cluster assignments. In particular, we study the positive regularization effect of Dirichlet priors on the final consensus solution with both synthetic and real benchmark data.
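As a minimal sketch of the evidence-accumulation step described above (toy labels, not the paper's benchmark data), the co-association matrix can be built by counting how often each pair of objects shares a cluster across the base partitions:

```python
import numpy as np

def co_association(partitions):
    """Build the EAC co-association matrix.

    partitions : list of 1-D integer label arrays, one per base clustering,
                 all of length n (number of objects).
    Returns an n x n matrix whose (i, j) entry is the fraction of partitions
    in which objects i and j fall in the same cluster.
    """
    n = len(partitions[0])
    C = np.zeros((n, n))
    for labels in partitions:
        labels = np.asarray(labels)
        C += (labels[:, None] == labels[None, :]).astype(float)
    return C / len(partitions)

# Toy ensemble of three base clusterings over five objects
ensemble = [
    [0, 0, 1, 1, 1],
    [0, 0, 0, 1, 1],
    [1, 1, 2, 2, 0],
]
print(co_association(ensemble))
```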

Relevance:

30.00%

Publisher:

Abstract:

Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies.

Relevance:

30.00%

Publisher:

Abstract:

Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies.

Relevance:

30.00%

Publisher:

Abstract:

Nowadays, reducing energy consumption is one of the highest priorities and biggest challenges faced worldwide, particularly in the industrial sector. Given the increasing trend in consumption and the current economic crisis, identifying cost reductions in the most energy-intensive sectors has become one of the main concerns among companies and researchers. Particularly in industrial environments, energy consumption is affected by several factors, namely production factors (e.g. equipment), human factors (e.g. operator experience) and environmental factors (e.g. temperature), among others, which influence how energy is used across the plant. Therefore, several approaches for identifying the causes of consumption have been suggested and discussed. However, the existing methods only provide guidelines for energy consumption and have shown difficulties in explaining certain energy consumption patterns, due to the lack of a structure to incorporate contextual influence, and are hence not able to trace the causes of consumption down to the process level, where optimization measures can actually take place. This dissertation proposes a new approach to tackle this issue, based on the on-line estimation of context-based energy consumption models that are able to map the operating context to consumption patterns. Context identification is performed by regression tree algorithms. Energy consumption estimation is achieved by means of a multi-model architecture using multiple RLS algorithms, locally estimated for each operating context. Lastly, the proposed approach is applied to a real cement plant grinding circuit. Experimental results demonstrate the viability of the overall system, regarding both automatic context identification and energy consumption estimation.
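The sketch below illustrates the kind of per-context recursive least squares (RLS) update such a multi-model architecture relies on; the regressors, forgetting factor and context names are invented and do not reproduce the dissertation's plant model:

```python
import numpy as np

class RLS:
    """Recursive least squares with exponential forgetting."""
    def __init__(self, n_params, forgetting=0.99):
        self.theta = np.zeros(n_params)      # parameter estimates
        self.P = np.eye(n_params) * 1e3      # inverse covariance (large initial uncertainty)
        self.lam = forgetting

    def update(self, phi, y):
        """phi: regressor vector, y: measured energy consumption."""
        Pphi = self.P @ phi
        k = Pphi / (self.lam + phi @ Pphi)   # gain vector
        err = y - phi @ self.theta           # prediction error
        self.theta += k * err
        self.P = (self.P - np.outer(k, Pphi)) / self.lam
        return self.theta

# One RLS model per operating context identified by the regression tree
models = {"context_A": RLS(3), "context_B": RLS(3)}

# Stream of (context, regressors, consumption) samples (purely illustrative)
for ctx, phi, y in [("context_A", np.array([1.0, 0.5, 2.0]), 3.1),
                    ("context_B", np.array([1.0, 1.2, 0.3]), 1.7)]:
    models[ctx].update(phi, y)
print(models["context_A"].theta)
```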

Relevance:

30.00%

Publisher:

Abstract:

One of the first scientific maps of the Amazon region, The Course of the Amazon River (Le Cours de La Rivière des Amazones), was constructed by Nicolas Sanson, a French cartographer of the seventeenth century, and served as the prototype for many others. Until now, the chart has been judged, following the opinion of La Condamine, to be a very defective map, a mere sketch based on a historical account. The aim of the present work was therefore to show that the map of the Amazon River traced by Nicolas Sanson is a scientific work: a map that presents geographic coordinates that are precise for its time, uses a well-determined prime meridian, and employs a creative methodology to deduce longitudes from latitudes and distances covered. To demonstrate these characteristics, the accuracy of the map was analysed by comparing its latitudes and longitudes with those of a current map. We determined the prime meridian of the map and analysed the methodology used for the calculation of longitudes. The conclusion is that it is actually a good map for its time, particularly considering the technology and the limited information that Sanson had at his disposal, and that the negative assertion of La Condamine is therefore unfounded.

Relevance:

30.00%

Publisher:

Abstract:

Manganese ferrite nanoparticles with a size distribution of 26 ± 7 nm (from TEM measurements) were synthesized by the coprecipitation method. The obtained nanoparticles exhibit superparamagnetic behaviour at room temperature, with a magnetic squareness of 0.016 and a coercivity field of 6.3 Oe. These nanoparticles were either entrapped in liposomes (aqueous magnetoliposomes, AMLs) or covered with a lipid bilayer, forming solid magnetoliposomes (SMLs). Both types of magnetoliposomes, exhibiting sizes below or around 150 nm, were found to be suitable for biomedical applications. Membrane fusion between magnetoliposomes (both AMLs and SMLs) and GUVs (giant unilamellar vesicles), the latter used as models of cell membranes, was confirmed by Förster Resonance Energy Transfer (FRET) assays, using an NBD-labeled lipid as the energy donor and Nile Red or rhodamine B-DOPE as the energy acceptor. A potential antitumor thienopyridine derivative was successfully incorporated into both aqueous and solid magnetoliposomes, pointing to a promising application of these systems in oncological therapy, simultaneously as hyperthermia agents and nanocarriers for antitumor drugs.

Relevance:

30.00%

Publisher:

Abstract:

Published in "NanoPT2016 book of abstracts"

Relevance:

30.00%

Publisher:

Abstract:

Introduction: Non-invasive brain imaging techniques often contrast experimental conditions across a cohort of participants, obfuscating distinctions in individual performance and brain mechanisms that are better characterised by the inter-trial variability. To overcome such limitations, we developed topographic analysis methods for single-trial EEG data [1]. Single-trial analysis has so far typically been based on time-frequency analysis of single-electrode data or of single independent components. The method's efficacy is demonstrated for event-related responses to environmental sounds, hitherto studied at the average event-related potential (ERP) level. Methods: Nine healthy subjects participated in the experiment. Auditory meaningful sounds of common objects were used for a target detection task [2]. In each block, subjects were asked to discriminate target sounds, which were living or man-made auditory objects. Continuous 64-channel EEG was acquired during the task. Two datasets were considered for each subject, comprising the single trials of the two conditions, living and man-made. The analysis comprised two steps. In the first step, a mixture of Gaussians analysis [3] provided representative topographies for each subject. In the second step, conditional probabilities for each Gaussian provided statistical inference on the structure of these topographies across trials, time, and experimental conditions. A similar analysis was conducted at the group level. Results: The results show that the occurrence of each map is structured in time and consistent across trials at both the single-subject and group levels. By conducting separate analyses of ERPs at the single-subject and group levels, we could quantify the consistency of the identified topographies and their time course of activation within and across participants as well as experimental conditions. A general agreement was found with previous analyses at the average ERP level. Conclusions: This novel approach to single-trial analysis promises to have an impact on several domains. In clinical research, it offers the possibility of statistically evaluating single-subject data, an essential tool for analysing patients with specific deficits and impairments and their deviation from normative standards. In cognitive neuroscience, it provides a novel tool for understanding behaviour and brain activity interdependencies at both the single-subject and group levels. In basic neurophysiology, it provides a new representation of ERPs and promises to cast light on the mechanisms of their generation and inter-individual variability.
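As a loose illustration of the first analysis step (simulated data only; the authors' mixture model [3] and its exact parameterization are not reproduced here), a mixture of Gaussians can be fitted to single-trial topographies and the per-trial, per-time-point component probabilities recovered as follows:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Hypothetical single-trial data: trials x time points x channels
n_trials, n_times, n_channels = 120, 100, 64
eeg = np.random.randn(n_trials, n_times, n_channels)

# Stack every single-trial topography (one 64-channel map per trial and time point)
maps = eeg.reshape(-1, n_channels)

# Fit a mixture of Gaussians; each component acts as one representative topography
gmm = GaussianMixture(n_components=5, covariance_type="diag", random_state=0)
gmm.fit(maps)

# Conditional probabilities (responsibilities) of each component, reshaped back
# to trials x time points x components for statistical inference over time/trials
resp = gmm.predict_proba(maps).reshape(n_trials, n_times, -1)
print(resp.shape)  # (120, 100, 5)
```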

Relevance:

30.00%

Publisher:

Abstract:

During the past few decades, numerous plasmid vectors have been developed for cloning, gene expression analysis, and genetic engineering. Cloning procedures typically rely on PCR amplification, DNA fragment restriction digestion, recovery, and ligation, but increasingly, procedures are being developed to assemble large synthetic DNAs. In this study, we developed a new gene delivery system using the integrase activity of an integrative and conjugative element (ICE). The advantage of the integrase-based delivery is that it can stably introduce a large DNA fragment (at least 75 kb) into one or more specific sites (the gene for glycine-accepting tRNA) on a target chromosome. Integrase recombination activity in Escherichia coli is kept low by using a synthetic hybrid promoter, which, however, is unleashed in the final target host, forcing the integration of the construct. Upon integration, the system is again silenced. Two variants with different genetic features were produced, one in the form of a cloning vector in E. coli and the other as a mini-transposable element by which large DNA constructs assembled in E. coli can be tagged with the integrase gene. We confirmed that the system could successfully introduce cosmid and bacterial artificial chromosome (BAC) DNAs from E. coli into the chromosome of Pseudomonas putida in a site-specific manner. The integrase delivery system works in concert with existing vector systems and could thus be a powerful tool for synthetic constructions of new metabolic pathways in a variety of host bacteria.

Relevance:

30.00%

Publisher:

Abstract:

The minimum chromosome number of Glomus intraradices was assessed through cloning and sequencing of the highly divergent telomere-associated sequences (TAS) and by pulsed field gel electrophoresis (PFGE). The telomere of G. intraradices, as in other filamentous fungi, consists of TTAGGG repeats, as confirmed using Bal31 nuclease time-course reactions. Telomere length was estimated to be roughly 0.9 kb by Southern blotting of genomic DNA with a telomere probe. We identified six classes of cloned chromosomal termini based on the TAS. An unusually high genetic variation was observed within two of the six TAS classes. To further assess the total number of chromosome termini, we used telomere fingerprinting. Surprisingly, all hybridization patterns showed smears, demonstrating that TAS are remarkably variable in the G. intraradices genome. These analyses predict the presence of at least three chromosomes in G. intraradices, while PFGE showed a pattern of four bands ranging from 1.2 to 1.5 Mb. Taken together, our results indicate that there are at least four chromosomes in G. intraradices, but there are probably more. The information on TAS and telomeres in G. intraradices will be essential for making a physical map of the G. intraradices genome and could provide molecular markers for future studies of genetic variation among nuclei in these multigenomic fungi.
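Purely as an illustration of the telomeric motif discussed above (an invented toy sequence, not the study's clones), counting terminal TTAGGG repeats at a chromosome end is straightforward:

```python
import re

def terminal_ttaggg_repeats(seq):
    """Count consecutive TTAGGG repeats at the 3' end of a sequence string."""
    match = re.search(r"(?:TTAGGG)+$", seq.upper())
    return (len(match.group(0)) // 6) if match else 0

# Toy chromosome-end sequence; ~150 repeats correspond to roughly 0.9 kb of telomere
toy_end = "ACGTACGGTTACG" + "TTAGGG" * 150
n = terminal_ttaggg_repeats(toy_end)
print(n, "repeats, about", n * 6, "bp of telomere")
```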

Relevance:

30.00%

Publisher:

Abstract:

A factor limiting preliminary rockfall hazard mapping at the regional scale is often the lack of knowledge of potential source areas. Nowadays, high-resolution topographic data (LiDAR) can account for realistic landscape details even at large scale. With such fine-scale morphological variability, quantitative geomorphometric analyses become a relevant approach for delineating potential rockfall instabilities. Using the digital elevation model (DEM)-based “slope families” concept over areas of similar lithology, together with the cliff and scree zones available from the 1:25,000 topographic map, a rockfall hazard susceptibility map was drawn up for the canton of Vaud, Switzerland, in order to provide a relevant hazard overview. Slope surfaces steeper than morphometrically defined threshold angles were considered rockfall source zones. 3D modelling (CONEFALL) was then applied to each of the estimated source zones in order to assess the maximum runout length. Comparisons with known events and other rockfall hazard assessments show good agreement, indicating that it is possible to assess rockfall activity over large areas from DEM-based parameters and topographical elements.
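As a minimal sketch of the source-area delineation idea (invented DEM and threshold angle, not the study's lithology-specific slope-family thresholds), slope can be derived from a DEM grid and thresholded to flag candidate source cells:

```python
import numpy as np

def slope_degrees(dem, cell_size):
    """Slope angle (degrees) from a DEM grid using finite differences."""
    dzdy, dzdx = np.gradient(dem, cell_size)
    return np.degrees(np.arctan(np.hypot(dzdx, dzdy)))

# Toy 2 m-resolution DEM (invented): a ridge with steep flanks
x = np.linspace(0, 200, 101)
y = np.linspace(0, 200, 101)
X, Y = np.meshgrid(x, y)
dem = 100.0 * np.exp(-((X - 100.0) ** 2) / 800.0)

slope = slope_degrees(dem, cell_size=2.0)
threshold_deg = 45.0                    # assumed threshold angle
source_mask = slope > threshold_deg     # candidate rockfall source cells
print(source_mask.sum(), "cells flagged as potential source areas")
```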