18 results for Information content

at Universidad Politécnica de Madrid


Relevance:

100.00%

Publisher:

Abstract:

We propose a novel measure to assess the presence of meso-scale structures in complex networks. This measure is based on the identification of regular patterns in the adjacency matrix of the network, and on the calculation of the quantity of information lost when pairs of nodes are iteratively merged. We show how this measure is able to quantify several meso-scale structures, like the presence of modularity, bipartite and core-periphery configurations, or motifs. Results corresponding to a large set of real networks are used to validate its ability to detect non-trivial topological patterns.
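As a rough illustration of the merging idea described above, the sketch below iteratively collapses the pair of nodes with the most similar adjacency rows and accumulates the discrepancy discarded at each merge. The discrepancy measure (a simple row difference) and the averaging rule are assumptions made for the example; the paper's actual information-theoretic quantity is not reproduced here.

```python
# Minimal sketch, NOT the authors' exact metric: merge the two nodes whose
# connectivity patterns are most alike and record how much information
# (here, an L1 discrepancy between their adjacency rows) is discarded.
import numpy as np

def merge_information_loss(adj: np.ndarray) -> float:
    """Accumulate information lost while iteratively merging the two most
    similar nodes of a network given by adjacency matrix `adj`."""
    a = adj.astype(float).copy()
    total_loss = 0.0
    while a.shape[0] > 1:
        n = a.shape[0]
        best = (np.inf, 0, 1)
        for i in range(n):
            for j in range(i + 1, n):
                loss = np.abs(a[i] - a[j]).sum()   # row discrepancy
                if loss < best[0]:
                    best = (loss, i, j)
        loss, i, j = best
        total_loss += loss
        merged = (a[i] + a[j]) / 2.0               # average the two rows
        keep = [k for k in range(n) if k not in (i, j)]
        a = np.vstack([a[keep][:, keep], merged[keep][None, :]])
        a = np.hstack([a, np.append(merged[keep], 0.0)[:, None]])
    return total_loss

# Tiny example: a triangle plus an isolated node.
A = np.array([[0, 1, 1, 0], [1, 0, 1, 0], [1, 1, 0, 0], [0, 0, 0, 0]], float)
print(merge_information_loss(A))
```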

Relevance:

60.00%

Publisher:

Abstract:

Service-Oriented Computing (SOC) is a widely accepted paradigm for the development of flexible, distributed, and adaptable software systems, in which service compositions perform more complex, higher-level, often cross-organizational tasks using atomic services or other service compositions.
In such systems, Quality of Service (QoS) properties, such as performance, cost, availability, or security, are critical for the usability of services and their compositions in concrete applications. Analysis of these properties can become more precise and richer in information if it employs program analysis techniques, such as complexity and sharing analyses, which are able to take into account simultaneously the control and data structures, dependencies, and operations in a composition. Computation cost analysis for service compositions can support predictive monitoring and proactive adaptation by automatically inferring computation cost as upper- and lower-bound functions of the value or size of input messages. These cost functions can be used for adaptation by selecting the service candidates that minimize the total cost of the composition, based on the actual data passed to them. The cost functions can also be combined with empirically collected infrastructural parameters to produce QoS bound functions of the input data, which can be used to predict potential or imminent Service Level Agreement (SLA) violations at the moment of invocation. In mission-critical applications, effective and accurate continuous QoS prediction, based on continuations, can be achieved by constraint modeling of composition QoS based on its structure, known data at runtime, and (when available) the results of complexity analysis. This approach can be applied to service orchestrations with centralized flow control, as well as to choreographies with multiple participants engaged in complex stateful interactions. Sharing analysis can support adaptation actions, such as parallelization, fragmentation, and component selection, which are based on functional dependencies and on the information content of the composition's messages, internal data, and activities, in the presence of complex control constructs such as loops, branches, and sub-workflows. Both the functional dependencies and the information content (described through user-defined attributes) can be expressed using a first-order logic (Horn clause) representation, and the analysis results can be interpreted as lattice-based conceptual models.
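The abstract above describes using inferred upper and lower computation-cost bounds, as functions of input-message size, to anticipate SLA violations at invocation time. The sketch below is a minimal illustration of that check only; the bound functions and the SLA budget are hypothetical placeholders, not the output of any actual cost analysis.

```python
# Hedged illustration: flag potential SLA violations before invoking a service
# by comparing inferred lower/upper cost bounds against an SLA budget.
from typing import Callable

def predict_sla_violation(lower: Callable[[int], float],
                          upper: Callable[[int], float],
                          input_size: int,
                          sla_budget: float) -> str:
    lo, hi = lower(input_size), upper(input_size)
    if lo > sla_budget:
        return "imminent violation"      # even the best case exceeds the budget
    if hi > sla_budget:
        return "potential violation"     # worst case exceeds the budget
    return "within SLA"

# Example with hypothetical bounds inferred for one composition step.
print(predict_sla_violation(lambda n: 0.5 * n, lambda n: 2.0 * n + 10, 100, 150.0))
```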

Relevance:

60.00%

Publisher:

Abstract:

Data-related properties of the activities involved in a service composition can be used to facilitate several design-time and run-time adaptation tasks, such as service evolution, distributed enactment, and instance-level adaptation. A number of these properties can be expressed using a notion of sharing. We present an approach for automated inference of data properties based on sharing analysis, which is able to handle service compositions with complex control structures, involving loops and sub-workflows. The properties inferred can include data dependencies, information content, domain-defined attributes, privacy or confidentiality levels, among others. The analysis produces characterizations of the data and the activities in the composition in terms of minimal and maximal sharing, which can then be used to verify compliance of potential adaptation actions, or as supporting information in their generation. This sharing analysis approach can be used both at design time and at run time. In the latter case, the results of analysis can be refined using the composition traces (execution logs) at the point of execution, in order to support run-time adaptation.
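As a toy illustration of the kind of result such an analysis can drive (not the authors' actual inference), the sketch below describes each activity by the sets of data sources its messages may or must carry, and derives may-share/must-share relations that could, for instance, support a parallelization check. The activity names and attributes are invented for the example.

```python
# Toy sharing characterization: "may" over-approximates, "must" under-approximates
# the data sources an activity depends on; their intersections give maximal and
# minimal sharing between activities.
may = {
    "getQuote":   {"customer_id", "product_id"},
    "checkStock": {"product_id"},
    "billClient": {"customer_id", "credit_card"},
}
must = {
    "getQuote":   {"product_id"},
    "checkStock": {"product_id"},
    "billClient": {"credit_card"},
}

def may_share(a: str, b: str) -> set:
    return may[a] & may[b]

def must_share(a: str, b: str) -> set:
    return must[a] & must[b]

# Activities with empty MAY-sharing are independent and safe to run in parallel.
print(may_share("checkStock", "billClient"))   # set() -> parallelizable
print(must_share("getQuote", "checkStock"))    # {'product_id'}
```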

Relevance:

60.00%

Publisher:

Abstract:

Social behavior is mainly based on swarm colonies, in which each individual shares its knowledge about the environment with other individuals to reach optimal solutions. Such a cooperative model differs from competitive models in that individuals die and new ones are born by combining the information of those alive. This paper presents a particle swarm optimization with differential evolution algorithm to train a neural network, instead of the classic back-propagation algorithm. The performance of a neural network for a particular problem is critically dependent on the choice of the processing elements, the network architecture and the learning algorithm. This work is focused on the development of methods for the evolutionary design of artificial neural networks, and in particular on optimizing the topology and connectivity structure of these networks.
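A compact, hedged sketch of the general technique follows: particle swarm optimization, with an added differential-evolution-style recombination step, searches the weight space of a small feed-forward network instead of back-propagation. The network size, the XOR data, and all hyperparameters are illustrative choices, not those of the paper.

```python
# PSO + DE-style recombination training the weights of a tiny 2-4-1 network.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

def forward(w, x):
    # w is a flat vector of 17 weights: W1 (2x4), b1 (4), W2 (4), b2 (1)
    W1, b1 = w[:8].reshape(2, 4), w[8:12]
    W2, b2 = w[12:16], w[16]
    h = np.tanh(x @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))

def mse(w):
    return float(np.mean((forward(w, X) - y) ** 2))

n_particles, dim = 30, 17
pos = rng.normal(size=(n_particles, dim))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_err = np.array([mse(p) for p in pos])
gbest = pbest[pbest_err.argmin()].copy()

for _ in range(300):
    r1, r2 = rng.random((n_particles, 1)), rng.random((n_particles, 1))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    # DE-style step: perturb each particle with a scaled difference of two others
    a, b = rng.integers(n_particles, size=(2, n_particles))
    trial = pos + 0.5 * (pos[a] - pos[b])
    for i in range(n_particles):
        if mse(trial[i]) < mse(pos[i]):
            pos[i] = trial[i]                 # keep the better of particle vs. trial
        e = mse(pos[i])
        if e < pbest_err[i]:
            pbest_err[i], pbest[i] = e, pos[i].copy()
    gbest = pbest[pbest_err.argmin()].copy()

print("final MSE:", mse(gbest))
```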

Relevance:

60.00%

Publisher:

Abstract:

One of the biggest challenges that software developers face is making an accurate estimate of project effort. In this work, radial basis function neural networks are applied to software effort estimation using a NASA dataset. The paper evaluates and compares a radial basis function network against a regression model. The results show that the radial basis function neural network obtained a lower mean squared error than the regression method.
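The sketch below illustrates the comparison in spirit only: a simple RBF network (random centers, least-squares output weights) against ordinary linear regression, both scored by mean squared error. The synthetic size/effort data stands in for the NASA dataset, which is not reproduced here.

```python
# RBF network vs. linear regression on synthetic effort-estimation data.
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(10, 400, size=(120, 1))                  # "project size"
y = 2.5 * X[:, 0] ** 0.9 + rng.normal(0, 15, size=120)   # "effort" with noise
train, test = np.arange(90), np.arange(90, 120)

def rbf_features(X, centers, width):
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * width ** 2))

centers = X[rng.choice(train, 10, replace=False)]         # random RBF centers
Phi = rbf_features(X, centers, width=50.0)
w_rbf, *_ = np.linalg.lstsq(Phi[train], y[train], rcond=None)

A = np.c_[X, np.ones(len(X))]                              # linear regression design
w_lin, *_ = np.linalg.lstsq(A[train], y[train], rcond=None)

mse = lambda pred: float(np.mean((pred - y[test]) ** 2))
print("RBF MSE:   ", mse(Phi[test] @ w_rbf))
print("Linear MSE:", mse(A[test] @ w_lin))
```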

Relevance:

30.00%

Publisher:

Abstract:

Cultural content on the Web is available in various domains (cultural objects, datasets, geospatial data, moving images, scholarly texts and visual resources), concerns various topics, is written in different languages, is targeted at both laymen and experts, and is provided by different communities (libraries, archives, museums and the information industry) and individuals (Figure 1). The integration of information technologies and cultural heritage content on the Web is expected to have an impact on everyday life from the point of view of institutions, communities and individuals. In particular, collaborative environments can recreate 3D navigable worlds that can offer new insights into our cultural heritage (Chan 2007). However, the main barrier is for end-users of cultural content, as well as for the organisations and communities managing and producing it, to find and relate cultural heritage information. In this paper, we explore several visualisation techniques for supporting cultural interfaces, where the role of metadata is essential for supporting search and communication among end-users (Figure 2). A conceptual framework was developed to integrate the data, purpose, technology, impact, and form components of a collaborative environment. Our preliminary results show that collaborative environments can help with cultural heritage information sharing and communication tasks because of the way in which they provide a visual context to end-users. They can be regarded as distributed virtual reality systems that offer graphically realised, potentially infinite, digital information landscapes. Moreover, collaborative environments also provide a new way of interaction between an end-user and a cultural heritage dataset. Finally, the visualisation of a dataset's metadata plays an important role in helping end-users in their search for heritage content on the Web.

Relevance:

30.00%

Publisher:

Abstract:

Management of certain populations requires the preservation of their pure genetic background. When, for different reasons, undesired alleles are introduced, the original genetic composition must be recovered. The present study tested, through computer simulations, the power of recovery (the ability to remove the foreign information) from genealogical data. Simulated scenarios comprised different numbers of exogenous individuals taking part in the founder population and different numbers of unmanaged generations before the removal programme started. Strategies were based on variables arising from classical pedigree analyses, such as founders' contributions and partial coancestry. The efficiency of the different strategies was measured as the proportion of native genetic information remaining in the population. Consequences on the inbreeding and coancestry levels of the population were also evaluated. Minimisation of the exogenous founders' contributions was the most powerful method, removing the largest amount of exogenous genetic information in just one generation. However, as a side effect, it led to the highest values of inbreeding. Scenarios with a large amount of initial exogenous alleles (i.e. a high percentage of non-native founders), or with many generations of mixing, proved very difficult to recover, pointing out the importance of being careful about introgression events in populations.
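As a toy illustration of the winning strategy reported above (minimising the exogenous founders' contribution), the sketch below ranks candidate matings by the expected exogenous fraction of their offspring. The candidate values are invented; the actual study derives these contributions from full pedigree data.

```python
# Rank candidate matings by expected exogenous founder contribution of offspring.
from itertools import combinations

# candidate -> estimated exogenous founder contribution (0 = fully native)
exo = {"A": 0.00, "B": 0.10, "C": 0.25, "D": 0.05, "E": 0.40}

def offspring_exo(p1: str, p2: str) -> float:
    # an offspring's expected exogenous fraction is the parents' average
    return 0.5 * (exo[p1] + exo[p2])

ranked = sorted(combinations(exo, 2), key=lambda pair: offspring_exo(*pair))
for p1, p2 in ranked[:3]:
    print(p1, "x", p2, "->", offspring_exo(p1, p2))
```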

Relevance:

30.00%

Publisher:

Abstract:

The main goal of the bilingual and monolingual participation of the MIRACLE team in CLEF 2004 was to test the effect of combination approaches on information retrieval. The starting point was a set of basic components: stemming, transformation, filtering, generation of n-grams, weighting and relevance feedback. Some of these basic components were used in different combinations and order of application for document indexing and for query processing. A second order combination was also tested, mainly by averaging or selective combination of the documents retrieved by different approaches for a particular query.
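The following sketch illustrates the second-order combination step in its simplest form: normalising the scores assigned by several retrieval configurations and averaging them per document for a query. The run names, documents, and scores are invented placeholders, not MIRACLE's actual runs.

```python
# Average normalised document scores across several retrieval configurations.
from collections import defaultdict

runs = {  # configuration -> ranked (doc_id, score) list for one query
    "stem+ngrams":     [("d1", 0.92), ("d3", 0.41), ("d2", 0.30)],
    "stem+feedback":   [("d3", 0.88), ("d1", 0.70), ("d4", 0.22)],
    "plain+weighting": [("d1", 0.65), ("d4", 0.60), ("d2", 0.15)],
}

def combine_by_average(runs):
    totals = defaultdict(float)
    for results in runs.values():
        top = max(score for _, score in results) or 1.0
        for doc, score in results:
            totals[doc] += score / top          # normalise per run, then sum
    return sorted(((totals[d] / len(runs), d) for d in totals), reverse=True)

for score, doc in combine_by_average(runs):
    print(f"{doc}: {score:.3f}")
```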

Relevance:

30.00%

Publisher:

Abstract:

This paper describes the first set of experiments defined by the MIRACLE (Multilingual Information RetrievAl for the CLEF campaign) research group for some of the cross-language tasks defined by CLEF. These experiments combine different basic techniques, both linguistically oriented and statistically oriented, applied to the indexing and retrieval processes.

Relevance:

30.00%

Publisher:

Abstract:

Commercial computer-aided design systems support the geometric definition of the product, but they lack utilities to support the initial design stages. Typical tasks such as customer need capture, functional requirement formalization, or design parameter definition are conducted in applications that, for instance, support "quality function deployment" and "failure modes and effects analysis" techniques. Such applications are not interoperable with computer-aided design systems, leading to discontinuous design information flows. This study addresses this issue and proposes a method to enhance the integration of design information generated in the early design stages into a commercial computer-aided design system. To demonstrate the feasibility of the approach adopted, a prototype application was developed and two case studies were executed.

Relevance:

30.00%

Publisher:

Abstract:

One medium-term strategy for helping in the management of complexity is the introduction of a conceptual complexity component at the very centre of university curricula. In very few areas is the growth of complexity as evident as in the information technologies (ITs), the focus of the work presented in the current paper. We have therefore developed an integrated way of tackling the specific field of information technologies by means of an approach to complexity. This paper describes the guidelines of our research effort, placing an emphasis on informatics. Concepts of complexity based on the system metaphor have been substantially drawn upon in this exercise and are thus presented in some detail. Also described is a didactic experiment conducted by the author and designed to provide a new and integrating approach to university curricula for future professionals. The students' "discovery" of complexity is the focal point of the experiment. The findings of this effort are encouraging and call for the continuation and expansion of the experiment.

Relevance:

30.00%

Publisher:

Abstract:

The emergence of cloud datacenters enhances the capability of online data storage. Since massive amounts of data are stored in datacenters, it is necessary to effectively locate and access data of interest in such a distributed system. However, traditional search techniques only allow users to search for images using exact-match keywords through a centralized index. These techniques cannot satisfy the requirements of content-based image retrieval (CBIR). In this paper, we propose a scalable image retrieval framework which can efficiently support content similarity search and semantic search in a distributed environment. Its key idea is to integrate image feature vectors into distributed hash tables (DHTs) by exploiting the property of locality-sensitive hashing (LSH). Thus, images with similar content are most likely gathered onto the same node without the knowledge of any global information. For searching semantically close images, relevance feedback is adopted in our system to overcome the gap between low-level features and high-level semantics. We show that our approach yields a high recall rate with good load balance and requires only a small number of hops.
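A hedged sketch of the key mechanism follows: a random-hyperplane LSH signature is computed from an image feature vector and used as the key under which the image is published in the DHT, so similar vectors tend to be stored on the same node. The DHT is mocked as a dictionary and all sizes are arbitrary; the actual system's hashing and routing details are not reproduced here.

```python
# LSH signatures as DHT keys: similar feature vectors collide on the same node.
import hashlib
import numpy as np

rng = np.random.default_rng(42)
DIM, BITS = 64, 16
hyperplanes = rng.normal(size=(BITS, DIM))       # shared by all peers

def lsh_key(feature: np.ndarray) -> str:
    bits = (hyperplanes @ feature > 0).astype(int)
    sig = "".join(map(str, bits))
    return hashlib.sha1(sig.encode()).hexdigest()  # DHT key (e.g. a ring id)

dht = {}                                          # node id -> stored items (mock)
def publish(image_id: str, feature: np.ndarray):
    dht.setdefault(lsh_key(feature), []).append(image_id)

def query(feature: np.ndarray):
    return dht.get(lsh_key(feature), [])

v = rng.normal(size=DIM)
publish("img_001", v)
publish("img_002", v + 0.01 * rng.normal(size=DIM))   # near-duplicate feature
print(query(v))   # likely ['img_001', 'img_002'], both hashed to the same key
```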

Relevance:

30.00%

Publisher:

Abstract:

There is an increasing need to make the most efficient use of water for irrigation. A good approach to making irrigation as efficient as possible is to monitor soil water content (θ) using soil moisture sensors. Although there is a broad range of sensors and technologies, none of them can currently provide practical and accurate vertical and lateral moisture profiles spanning 0-1 m depth and 0.1-1,000 m lateral scales. In this regard, further research to fill the intermediate scale and to bridge single-point measurements with the broader scales is still needed. This dissertation is based on the use of Fiber Optics with Distributed Temperature Sensing (FO-DTS), a novel approach which has received growing interest in the last two decades. Specifically, we employ the so-called Actively Heated Fiber Optic (AHFO) method, in which FO cables are used as heat probes by applying electricity to the stainless steel armoring jacket or to an added conductor symmetrically positioned (wrapped) around the FO cable. AHFO is based on classic heated pulse probe (HPP) theory, which usually employs a heat probe that approximates an infinite line heat source injecting heat into the soil. Observation of the timing and magnitude of the thermal response to the energy input provides enough information to derive specific soil thermal characteristics such as the soil heat capacity, thermal conductivity, or water content. These parameters can be estimated by capturing the soil thermal response with a temperature sensor adjacent to the heat source (heating and sensing mounted together in the so-called single heated pulse probe, SHPP), or separated at a certain radial distance r (dual heated pulse probe, DHPP). This dissertation aims to test the feasibility of heated fiber optics for implementing HPP theory. Specifically, we focus on measuring soil water content (θ) and soil heat capacity (C) by employing two types of FO-DTS systems. The first is located in an agricultural field in La Nava de Arévalo (Ávila, Spain) and employs the SHPP theory to estimate θ. The second is developed in the laboratory following the DHPP theory, and focuses on estimating both C and θ. The SHPP theory can be implemented with actively heated fiber optics (AHFO) to obtain distributed measurements of soil water content (θ) by using the soil thermal responses reported by Distributed Temperature Sensing (DTS) together with a soil-specific calibration relationship. However, most reported AHFO applications have been calibrated under homogeneous laboratory soil conditions, while inexpensive and efficient calibration procedures useful in heterogeneous soils are lacking. In this PhD thesis, we employ the Hydrus 2D/3D code to define these soil-specific calibration curves. The model is then validated at a selected FO transect of the DTS installation.
The model was able to predict the soil thermal response at specific locations of the fiber optic cable once the surrounding soil hydraulic and thermal properties were known. Results using electromagnetic moisture sensors at the same locations demonstrate the ability of the model to detect θ within an accuracy of 0.001 to 0.022 m3 m-3. Implementation of the dual heated pulse probe (DHPP) theory for measurement of volumetric heat capacity (C) and water content (θ) with Distributed Temperature Sensing (DTS) heated fiber optic (FO) systems presents an unprecedented opportunity for environmental monitoring. We test the method using different combinations of FO cables and heat sources at a range of spacings in a laboratory setting. The amplitude and phase shift of the heat signal with distance were found to be functions of the soil volumetric heat capacity (referred to here as Cs). Estimates of Cs over a range of θ suggest the method is feasible, responding to changes in θ (a linear relationship was observed in all FO combinations), although a bias at low soil water contents (up to 22%) was also observed. Optimization will require further models that account for the finite radius and thermal influence of the FO cables, employed here as "needle probes". Consideration of the range of soil conditions, cable spacings, and jacket configurations is also suggested as a valuable subject for further study and development.
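For context, the classical short-pulse, infinite-line-source idealisation commonly cited in the dual-probe heat-pulse literature (e.g., Campbell et al., 1991) relates the peak temperature rise at spacing r to the volumetric heat capacity. It is included here only as background; the thesis may calibrate a refined model that accounts for cable radius and finite pulse duration.

```latex
% Short-pulse, infinite-line-source idealisation of the DHPP method (background).
\[
  \Delta T(r,t) \;=\; \frac{Q}{4\pi \kappa C\, t}\,
                       \exp\!\left(-\frac{r^{2}}{4\kappa t}\right),
  \qquad
  \Delta T_{\max} \;=\; \frac{Q}{e\,\pi r^{2} C}
  \;\;\Longrightarrow\;\;
  C \;=\; \frac{Q}{e\,\pi r^{2}\,\Delta T_{\max}},
\]
% where $Q$ is the heat released per unit length of cable (J\,m$^{-1}$), $r$ the
% cable spacing, $\kappa$ the thermal diffusivity, $C=\rho c$ the volumetric heat
% capacity, and $\Delta T_{\max}$ the peak temperature rise at the sensing cable.
```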

Relevance:

30.00%

Publisher:

Abstract:

This paper aims to outline a theory-based Content and Language Integrated Learning course and to establish the rationale for adopting a holistic approach to the teaching of languages in tertiary education. Our work focuses on the interdependence between Content and Language Integrated Learning (CLIL), and the use of Information and Communication Technologies (ICT), in particular regarding the learning of English within the framework of Telecommunications Engineering. The study first analyses the diverse components of the instructional approach and the extent to which this approach interrelates with technologies within the context of what we have defined as a holistic experience, since it also aims to develop a set of generic competences or transferable skills. Second, an example of a course project framed in this holistic approach is described in order to exemplify the specific actions suggested for learner autonomy and CLIL. The approach provides both an adequate framework as well as the conditions needed to carry out a lifelong learning experience within our context, a Spanish School of Engineering. In addition to specialized language and content, the approach integrates the learning of skills and capacities required by the new plans that have been established following the Bologna Declaration in 1999.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents an approach to compare two types of data, subjective data (the polarity of the Pan American Games 2011 event by country) and objective data (the number of medals won by each participating country), based on the Pearson correlation. When dealing with events described by people, knowledge acquisition is difficult because the content is heterogeneous and subjective. A first step towards knowing the polarity of the information provided by people consists in automatically classifying the posts into clusters according to their polarity. The authors carried out a set of experiments using a corpus that consists of 5600 posts extracted from 168 Internet resources related to a specific event: the 2011 Pan American Games. The approach is based on four components: a crawler, a filter, a synthesizer and a polarity analyzer. The PanAmerican approach automatically classifies the polarity of the event into clusters with the following results: 588 positive, 336 neutral, and 76 negative. Our work found that the polarity of the content produced was strongly influenced by the results of the event, with a correlation of .74. Thus, it is possible to conclude that the polarity of content is strongly affected by the results of the event. Finally, the accuracy of the PanAmerican approach, measured as precision over the three polarity classes evaluated, is .87, .90, and .80.
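As a minimal illustration of the final correlation step, the sketch below computes the Pearson coefficient between per-country polarity aggregates and medal counts. The country figures are placeholders, not the paper's data (which reported r = .74).

```python
# Pearson correlation between per-country polarity scores and medal counts.
import numpy as np

polarity = np.array([0.62, 0.35, 0.10, -0.05, 0.48, 0.20])   # hypothetical
medals   = np.array([  58,   21,    4,     1,   33,   12])   # hypothetical

def pearson(x, y):
    x, y = x - x.mean(), y - y.mean()
    return float((x @ y) / np.sqrt((x @ x) * (y @ y)))

print(f"Pearson r = {pearson(polarity, medals):.2f}")
```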