36 results for Branching Processes in Varying Environments
at Universidad Politécnica de Madrid
Abstract:
The set agreement problem states that, of n proposed values, at most n−1 can be decided. Traditionally, this problem is solved using a failure detector in asynchronous systems where processes may crash but do not recover, where processes have distinct identities, and where all processes initially know the membership. In this paper we study the set agreement problem and the weakest failure detector L used to solve it in asynchronous message-passing systems where processes may crash and recover, with homonyms (i.e., processes may have identical identities) and without complete initial knowledge of the membership.
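The safety conditions of (n−1)-set agreement can be checked mechanically. The sketch below is an illustrative checker under my own naming (`check_set_agreement` is a hypothetical helper, not part of the paper); it verifies validity (every decided value was proposed) and the k-agreement bound:

```python
def check_set_agreement(proposals, decisions, k=None):
    """Check the safety properties of k-set agreement.

    proposals: values proposed by the n processes.
    decisions: values decided by the processes that decided.
    k: maximum number of distinct decided values (defaults to n-1).
    """
    n = len(proposals)
    if k is None:
        k = n - 1  # classic set agreement: at most n-1 distinct values
    # Validity: every decided value was proposed by some process.
    validity = all(d in proposals for d in decisions)
    # k-agreement: at most k distinct values are decided overall.
    agreement = len(set(decisions)) <= k
    return validity and agreement

# Three processes propose {1, 2, 3}: two distinct decisions satisfy
# 2-set agreement, three distinct decisions violate it.
print(check_set_agreement([1, 2, 3], [1, 2]))     # True
print(check_set_agreement([1, 2, 3], [1, 2, 3]))  # False
```

This only captures the safety side; liveness (every correct process eventually decides) is what the failure detector L is needed for.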
Abstract:
First, this paper describes a future layered Air Traffic Management (ATM) system centred on the execution phase of flights. The layered ATM model is based on the work currently performed by SESAR [1] and takes into account the availability of accurate and updated flight information "seen by all" across the European airspace. This shared information on each flight will be referred to as the Reference Business Trajectory (RBT). In the layered ATM system, exchanges of information will involve several actors (human or automatic) with varying time horizons, areas of responsibility and tasks. Second, the paper identifies the need to define the negotiation processes required to agree revisions to the RBT in the layered ATM system. Third, the final objective of the paper is to bring to the attention of researchers and engineers the commonalities between multi-player games and Collaborative Decision Making (CDM) processes in a layered ATM system.
Abstract:
The stream-mining approach is defined as a set of cutting-edge techniques designed to process streams of data in real time in order to extract knowledge. In the particular case of classification, stream mining has to adapt its behaviour to volatile underlying data distributions, a phenomenon that has been called concept drift. Moreover, concept drift may lead to situations where predictive models become invalid and therefore have to be updated to represent the actual concepts the data presents. In this context, there is a specific type of concept drift, known as recurrent concept drift, where the concepts represented by the data have already appeared in the past. In those cases the learning process could be saved, or at least minimized, by applying a previously trained model. This could be extremely useful in ubiquitous environments, which are characterized by resource-constrained devices. To deal with this scenario, meta-models can be used to enhance the drift detection mechanisms of data stream algorithms by representing and predicting when a change will occur. There are real-world situations where a concept reappears, as in intrusion detection systems (IDS), where the same incidents, or adaptations of them, usually reappear over time. In these environments the early prediction of drift, through better knowledge of past models, can help to anticipate the change, thus improving the efficiency of the model with respect to the number of training instances needed. Using meta-models as a recurrent drift detection mechanism opens up the possibility of sharing concept representations among different data mining processes. Such exchanges could improve the accuracy of the resulting local model, as it may benefit from patterns similar to the local concept that were observed in other scenarios but not yet locally.
This would also improve the efficiency of the training instances used during the classification process, as the exchange of models would aid in the application of already trained recurrent models previously seen by any of the collaborating devices; that is to say, the scope of recurrence detection and representation is broadened. In fact, the detection, representation and exchange of concept drift patterns would be extremely useful for law enforcement activities fighting cyber crime. Since information exchange is one of the main pillars of cooperation, national units would benefit from the experience and knowledge gained by third parties. Moreover, in the specific scope of critical infrastructure protection, it is crucial to have information exchange mechanisms, from both a strategic and a technical standpoint. The exchange of concept drift detection schemes in cyber security environments would aid in preventing, detecting and effectively responding to threats in cyberspace. Furthermore, as a complement to meta-models, a mechanism to assess the similarity between classification models is also needed when dealing with recurrent concepts. In this context, when reusing a previously trained model, a rough comparison between concepts is usually made, applying boolean logic. Introducing fuzzy logic comparisons between models could lead to more efficient reuse of previously seen concepts, by applying not just equal models but also similar ones. This work addresses the aforementioned open issues by means of: the MMPRec system, which integrates a meta-model mechanism and a fuzzy similarity function; a collaborative environment to share meta-models between different devices; and a recurrent drift generator that allows the usefulness of recurrent drift systems such as MMPRec to be tested. Moreover, this thesis presents an experimental validation of the proposed contributions using synthetic and real datasets.
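The core loop described above — detect drift, compare the new concept against a repository of stored ones with a graded (fuzzy) similarity, and reuse a past model when the match is close enough — can be sketched as follows. This is a minimal illustration under assumed names (`signature`, `similarity`, `on_drift` and the majority-class "model" are my own simplifications, not MMPRec's actual components):

```python
from collections import Counter

def signature(window):
    """Crude concept signature: the class distribution of a labelled window."""
    counts = Counter(label for _, label in window)
    total = sum(counts.values())
    return {c: n / total for c, n in counts.items()}

def similarity(sig_a, sig_b):
    """Fuzzy similarity in [0, 1]: overlap of two class distributions."""
    classes = set(sig_a) | set(sig_b)
    return sum(min(sig_a.get(c, 0.0), sig_b.get(c, 0.0)) for c in classes)

def on_drift(window, repository, threshold=0.8):
    """On drift, reuse the most similar stored concept if it is close
    enough; otherwise train a new model (here: majority class) and
    store it for future recurrences."""
    sig = signature(window)
    best_model, best_sim = None, 0.0
    for stored_sig, model in repository:
        s = similarity(sig, stored_sig)
        if s > best_sim:
            best_model, best_sim = model, s
    if best_model is not None and best_sim >= threshold:
        return best_model                                  # recurrent concept: reuse
    model = Counter(label for _, label in window).most_common(1)[0][0]
    repository.append((sig, model))                        # new concept: store
    return model

repository = []
print(on_drift([(0, 'ham'), (1, 'ham'), (2, 'spam')], repository))  # trains 'ham'
print(on_drift([(3, 'ham'), (4, 'ham'), (5, 'spam')], repository))  # reuses 'ham'
```

Because the threshold accepts similar (not only identical) distributions, the repository can also hold signatures received from collaborating devices, which is the sharing scenario the abstract describes.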
Abstract:
Augmented reality (AR) is being increasingly used in mobile devices. Most of the available applications are set to work outdoors, mainly due to the availability of a reliable positioning system. Nevertheless, indoor (smart) spaces offer many opportunities for creating new service concepts. In particular, in this paper we explore the applicability of mobile AR to hospitality environments (hotels and similar establishments). From the state of the art of technologies and applications, a portfolio of services has been identified and a prototype using off-the-shelf technologies has been designed. Our objective is to identify the next technological challenges to overcome in order to have suitable underlying infrastructures and innovative services that enhance the traveller's experience.
Abstract:
The aim of this paper is to clarify the role played by the most commonly used viscous terms in simulating viscous laminar flows using the weakly compressible approach in the context of smoothed particle hydrodynamics (WCSPH). To achieve this, the viscous terms of Takeda et al. (Prog. Theor. Phys. 1994; 92(5):939–960), Morris et al. (J. Comput. Phys. 1997; 136:214–226) and Monaghan–Cleary–Gingold (Appl. Math. Model. 1998; 22(12):981–993; Monthly Notices of the Royal Astronomical Society 2005; 365:199–213) will be analysed, discussing their origins, structures and conservation properties. Their performance will be monitored with canonical flows whose viscosity-related phenomena are well understood and in which boundary effects are not relevant. Following the validation process on three previously published examples, two vortex flows of engineering importance have been studied: first, the evolution of an isolated Lamb–Oseen vortex, where viscous effects are dominant, and second, a pair of co-rotating vortices, in which viscous effects are combined with transport phenomena. The corresponding SPH solutions have been compared to finite-element numerical solutions. The behaviour of the SPH viscosity models in reproducing the viscosity-related effects of these canonical flows is found to be adequate.
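For reference, the isolated viscous vortex mentioned above admits a closed-form solution, which is what makes it a useful validation case. The standard Lamb–Oseen azimuthal velocity profile (stated here from the general literature, not quoted from the paper) is:

```latex
v_\theta(r,t) \;=\; \frac{\Gamma}{2\pi r}\left[\,1 - \exp\!\left(-\frac{r^{2}}{4\nu t}\right)\right]
```

where $\Gamma$ is the vortex circulation, $\nu$ the kinematic viscosity, $r$ the radial distance from the vortex centre and $t$ time; viscosity diffuses the initially concentrated vorticity outward, which is the effect the SPH viscous terms must reproduce.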
Abstract:
Oxygen 1s excitation and ionization processes in the CO2 molecule have been studied with dispersed and non-dispersed fluorescence spectroscopy as well as with the vacuum ultraviolet (VUV) photon–photoion coincidence technique. The intensity of the neutral O emission line at 845 nm shows particular sensitivity to core-to-Rydberg excitations and core–valence double excitations, while shape resonances are suppressed. In contrast, the partial fluorescence yield in the wavelength window 300–650 nm and the excitation functions of selected O+ and C+ emission lines in the wavelength range 400–500 nm display all of the absorption features. The relative intensity of ionic emission in the visible range increases towards higher photon energies, which is attributed to O 1s shake-off photoionization. VUV photon–photoion coincidence spectra reveal major contributions from the C+ and O+ ions and a minor contribution from C2+. No conclusive changes in the intensity ratios among the different ions are observed above the O 1s threshold. The line shape of the VUV–O+ coincidence peak in the mass spectrum carries some information on the initial core excitation.
Abstract:
Hydrology is the study of the properties, distribution and effects of water on the Earth's soil, rocks and atmosphere. It also encompasses the study of the hydrologic cycle of precipitation, runoff, infiltration, storage and evaporation, including the physical, biological and chemical reactions of water with the earth and its relation to life.
Abstract:
Identification and tracking of objects in specific environments such as harbours or security areas is a matter of great importance nowadays. For this purpose, numerous systems based on different technologies have been developed, resulting in a great amount of gathered data displayed through a variety of interfaces. This amount of information has to be evaluated by human operators in order to make the correct decisions, sometimes under highly critical situations demanding both speed and accuracy. To face this problem we describe IDT-3D, a platform for identification and tracking of vessels in a harbour environment, able to represent fused information in real time using a Virtual Reality application. The effectiveness of using IDT-3D as an integrated surveillance system is currently under evaluation. Preliminary results point to a significant decrease in the reaction and decision-making times of operators facing a critical situation. Although the current application focus of IDT-3D is quite specific, the results of this research could be extended to the identification and tracking of targets in other controlled environments of interest, such as coastlines, borders or even urban areas.
Abstract:
Synthetic Aperture Radar (SAR) images a target region's reflectivity function in the multi-dimensional spatial domain of range and cross-range. SAR synthesizes a large-aperture radar in order to achieve a finer azimuth resolution than that provided by any on-board real antenna. Conventional SAR techniques assume a single reflection of the transmitted waveforms from targets. Nevertheless, today's new scenes force SAR systems to work in urban environments; consequently, multiple-bounce returns are added to direct-scatter echoes. We refer to these as ghost images, since they obscure the true target image and lead to poor resolution. By analyzing the quadratic phase error (QPE), this paper demonstrates that the Earth's curvature influences the degree of defocusing of multipath returns. In addition to the QPE, parameters such as the integrated sidelobe ratio (ISLR), peak sidelobe ratio (PSLR), contrast (C) and entropy (E) provide us with tools to identify direct-scatter echoes in images containing undesired returns coming from multipath.
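The contrast (C) and entropy (E) metrics named above are standard image-focus measures: a well-focused return concentrates energy in few resolution cells (high contrast, low entropy), while a defocused multipath ghost spreads it out. The sketch below uses common textbook definitions, which may differ in detail from those in the paper:

```python
import math

def contrast(image):
    """Contrast C: standard deviation over mean of pixel intensities.
    Higher values usually indicate a better-focused image."""
    pixels = [p for row in image for p in row]
    mean = sum(pixels) / len(pixels)
    var = sum((p - mean) ** 2 for p in pixels) / len(pixels)
    return math.sqrt(var) / mean

def entropy(image):
    """Entropy E of the normalised intensity distribution.
    Lower values usually indicate a sharper image."""
    pixels = [p for row in image for p in row]
    total = sum(pixels)
    probs = [p / total for p in pixels if p > 0]
    return -sum(q * math.log(q) for q in probs)

sharp   = [[0, 0, 0], [0, 9, 0], [0, 0, 0]]  # energy in one resolution cell
blurred = [[1, 1, 1], [1, 1, 1], [1, 1, 1]]  # energy spread over all cells
print(contrast(sharp) > contrast(blurred))   # True
print(entropy(sharp) < entropy(blurred))     # True
```

Comparing these metrics between image patches is one way to separate direct-scatter echoes (sharp) from defocused multipath ghosts (spread out), which is the discrimination the paper exploits.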
Abstract:
Most human-designed environments present specific geometrical characteristics. In them it is easy to find polygonal, rectangular and circular shapes, with a series of typical relations between different elements of the environment. Introducing this kind of knowledge into the mapping process of mobile robots can notably improve the quality and accuracy of the resulting maps. It can also make them more suitable for higher-level reasoning applications. When mapping is formulated in a Bayesian probabilistic framework, a complete specification of the problem requires considering a prior for the environment. The prior over the structure of the environment can be applied in several ways; this dissertation presents two different frameworks, one using a feature-based approach and another employing a dense representation close to the measurement space. A feature-based approach implicitly imposes a prior for the environment. In this sense, feature-based graph SLAM was a first step towards a new mapping solution for structured scenarios. In the first framework, the prior is inferred by the system from a wide collection of feature-based priors, following an Expectation-Maximization approach to obtain the most probable structure and the most probable map. The representation of the structure of the environment is based on a hierarchical model with different levels of abstraction for the geometrical elements describing it. Various experiments were conducted to show the versatility and the good performance of the proposed method. In the second framework, different priors can be defined by the user as sets of local constraints and energies for consecutive points in a range scan of a given environment. The set of constraints applied to each group of points depends on the topology, which is inferred by the system. This way, flexible and generic priors can be incorporated very easily. Several tests were carried out to demonstrate the flexibility and the good results of the proposed approach.
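To make the idea of "local constraints and energies for consecutive points in a range scan" concrete, here is a minimal example of one such energy, not the thesis's actual formulation: a second-difference term that is zero exactly when neighbouring scan points are collinear and evenly spaced, i.e. a simple "straight wall" prior.

```python
def collinearity_energy(points):
    """Local structure energy over consecutive (x, y) range-scan points:
    sum of squared second differences. It vanishes when every triple of
    neighbours is collinear and evenly spaced (a straight-wall prior),
    and penalises corners and clutter."""
    energy = 0.0
    for (x0, y0), (x1, y1), (x2, y2) in zip(points, points[1:], points[2:]):
        dx = x0 - 2 * x1 + x2
        dy = y0 - 2 * y1 + y2
        energy += dx * dx + dy * dy
    return energy

wall   = [(0, 0), (1, 0), (2, 0), (3, 0)]  # straight wall: zero energy
corner = [(0, 0), (1, 0), (1, 1), (1, 2)]  # right-angle corner: penalised
print(collinearity_energy(wall))    # 0.0
print(collinearity_energy(corner))  # 2.0
```

In a full system, several such energy terms (walls, corners, circular arcs) would be switched on per group of points depending on the inferred topology, and the map would be obtained by minimising the total energy.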
Abstract:
A geochemical model of an urban environment is presented in which multielemental tracers are used to characterise the circulation of trace elements in particulate matter (atmospheric aerosol, street dust and urban soil) within a city.
Abstract:
Ubiquitous sensor network deployments, such as those found in smart-city and ambient-intelligence applications, impose constantly increasing computational demands in order to process data and offer services to users. The nature of these applications implies the use of data centers. Research has paid much attention to the energy consumption of the sensor nodes in WSN infrastructures. However, supercomputing facilities are the ones with the highest economic and environmental impact, due to their very high power consumption, and this problem has so far been disregarded in the field of smart environment services. This paper proposes an energy-minimizing workload assignment technique, based on heterogeneity and application awareness, that redistributes low-demand computational tasks from high-performance facilities to idle nodes with low and medium resources in the WSN infrastructure. These non-optimal allocation policies reduce both the energy consumed by the whole infrastructure and the total execution time.
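A minimal sketch of such a redistribution policy, under my own assumptions (the paper's actual cost model and node attributes are not specified here): each task goes to the cheapest node, by energy per unit of work, that still has capacity, so small tasks naturally drain towards idle low-power WSN nodes while large ones stay in the data center.

```python
def assign(tasks, nodes):
    """Greedy energy-aware assignment. tasks: (name, demand) pairs;
    nodes: dicts with 'id', remaining capacity 'free', and
    'energy_per_unit' (energy cost per unit of work)."""
    placement = {}
    for name, demand in sorted(tasks, key=lambda t: t[1]):  # small tasks first
        candidates = [n for n in nodes if n["free"] >= demand]
        if not candidates:
            raise RuntimeError(f"no node can host task {name!r}")
        node = min(candidates, key=lambda n: n["energy_per_unit"])
        node["free"] -= demand
        placement[name] = node["id"]
    return placement

nodes = [
    {"id": "datacenter", "free": 1000, "energy_per_unit": 5.0},
    {"id": "wsn-node-1", "free": 4,    "energy_per_unit": 1.0},  # idle, low power
]
tasks = [("sense-agg", 3), ("analytics", 200)]
placement = assign(tasks, nodes)
print(placement)  # {'sense-agg': 'wsn-node-1', 'analytics': 'datacenter'}
```

The greedy rule is deliberately simple; an application-aware scheduler would additionally weigh deadlines, data locality and network cost, but the energy-ordering idea is the same.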
Abstract:
In ubiquitous data stream mining applications, different devices often aim to learn concepts that are similar to some extent. In such applications, e.g., spam filtering or news recommendation, the concept underlying the data stream (e.g., interesting mail/news) is likely to change over time; therefore, the resulting model must be continuously adapted to such changes. This paper presents a novel Collaborative Data Stream Mining (Coll-Stream) approach that exploits the similarities in the knowledge available from other devices to improve local classification accuracy. Coll-Stream integrates the community knowledge using an ensemble method in which the classifiers are selected and weighted based on their local accuracy for different partitions of the feature space. We evaluate Coll-Stream's classification accuracy in situations with concept drift, noise, varying partition granularity and varying concept similarity with respect to the local underlying concept. The experimental results show that the model Coll-Stream produces achieves stability and accuracy in a variety of situations, using both synthetic and real-world datasets.
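The weighting scheme described above — each ensemble member votes with a weight equal to its tracked accuracy on the feature-space partition the incoming instance falls into — can be sketched as follows. This is an illustrative reconstruction, not Coll-Stream's actual code; the partition function and classifiers here are toy stand-ins:

```python
def predict(instance, ensemble, partition_of):
    """Weighted vote: each classifier's weight is its accuracy on the
    partition of the feature space that the instance belongs to."""
    part = partition_of(instance)
    votes = {}
    for clf in ensemble:
        label = clf["predict"](instance)
        votes[label] = votes.get(label, 0.0) + clf["accuracy"][part]
    return max(votes, key=votes.get)

# Toy partitioning: split the feature space by the sign of feature 0.
partition_of = lambda x: 0 if x[0] < 0 else 1

ensemble = [
    # Classifier A is reliable on partition 0, poor on partition 1.
    {"predict": lambda x: "spam", "accuracy": {0: 0.9, 1: 0.2}},
    # Classifier B is poor on partition 0, reliable on partition 1.
    {"predict": lambda x: "ham",  "accuracy": {0: 0.1, 1: 0.8}},
]
print(predict([-1.0], ensemble, partition_of))  # 'spam' (partition 0)
print(predict([+1.0], ensemble, partition_of))  # 'ham'  (partition 1)
```

The per-partition weights are what let knowledge imported from other devices help locally: a community classifier only dominates the vote in the regions of the feature space where it has actually proven accurate.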
Abstract:
The integration of correlation processes into design systems aims at taking measurements directly in 3D, according to the user's criteria, in order to generate the database required for the development of the project. In the photogrammetric phase, internal and external orientation parameters are calculated and stereo models are created from standard images. These are integrated in the system, where the measurement of the selected items is carried out by applying the developed correlation algorithms. The processing stage provides tools to perform the calculations easily and automatically, as well as image measurement techniques to acquire the most accurate information. The proposed software is developed on Visual Studio platforms for PC, applying the codes and symbols best suited to the terms of reference required for the design. Generating the database interactively, together with the geometric study of the structures, facilitates and improves the quality of the work in the projects.
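Image correlation of the kind used for such photogrammetric measurement is commonly based on normalised cross-correlation (NCC) between image patches. The sketch below shows the standard NCC formula and a 1-D template search along an image strip; it is a generic illustration, not the paper's particular algorithm:

```python
import math

def ncc(a, b):
    """Normalised cross-correlation of two equal-length patches, in [-1, 1];
    values near 1 indicate a likely match between image windows."""
    mean_a = sum(a) / len(a)
    mean_b = sum(b) / len(b)
    da = [x - mean_a for x in a]
    db = [x - mean_b for x in b]
    num = sum(x * y for x, y in zip(da, db))
    den = math.sqrt(sum(x * x for x in da) * sum(y * y for y in db))
    return num / den if den else 0.0

def best_match(template, strip):
    """Slide the template along a 1-D intensity strip of the second image
    and return the offset with the highest correlation score."""
    scores = [ncc(template, strip[i:i + len(template)])
              for i in range(len(strip) - len(template) + 1)]
    return max(range(len(scores)), key=scores.__getitem__)

template = [10, 50, 10]              # bright feature in the left image
strip = [10, 10, 10, 50, 10, 10]     # scan line in the right image
print(best_match(template, strip))   # 2
```

In a stereo pair the offset found this way is the disparity of the matched point, from which, together with the orientation parameters, the 3D coordinate is triangulated.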
Abstract:
Currently, cryopreservation techniques have a growing importance for long-term plant germplasm storage. These methods have undergone great progress during the last two decades, and adequate protocols for different plant systems have been developed using diverse strategies, such as vitrification, encapsulation-dehydration with alginate beads and the droplet-vitrification method. This PhD thesis has the goal of increasing the knowledge of the processes underlying the different steps of cryopreservation protocols, in relation to the state of water in tissues and its changes, approached through diverse biophysical techniques, especially differential scanning calorimetry (DSC) and low-temperature scanning electron microscopy (cryo-SEM). The processes of cooling to liquid nitrogen temperature and warming to room temperature at the end of the storage period, which are critical for the survival of the cryopreserved material, are described in a first study on these cryopreservation methods. Both cooling and warming must be carried out as quickly as possible because, although the low water content achieved during the previous protocol steps significantly reduces the probability of ice formation, it does not eliminate it completely. Within this context, the influence of the cooling and warming rates of plant vitrification solutions on their vitrification-related thermophysical parameters is also analyzed, in relation to their composition and component concentrations. These solutions are used in most of the currently employed plant-material cryopreservation protocols. Additionally, the influence of other factors that can determine the stability of the vitrified material, such as glass aging, is studied. An experimental investigation has been carried out on the use of cryo-SEM as a tool for visualizing the glassy state in cells and tissues submitted to cryopreservation processes. It has been compared with the better-known differential scanning calorimetry technique, and results in good agreement with, and complementary to, it have been obtained. The effect on plant tissues of adaptation to low temperature, and of the dehydration induced by the different treatments used in the protocols, was also explored with these techniques. This study allows observation of the biophysical evolution of the systems during the cryopreservation process. Lastly, the potential use of an additional chitosan film over the alginate beads used in encapsulation protocols was examined. No significant changes could be observed in the beads' dehydration behaviour, calorimetric parameters or surface aspect; the application of such films to confer additional properties on gel beads is promising.