28 results for Information Systems and Management
Abstract:
This thesis proposes a comprehensive approach to the monitoring and management of Quality of Experience (QoE) in multimedia delivery services over IP. It addresses the problem of preventing, detecting, measuring, and reacting to QoE degradations under the constraints of a service provider: the solution must scale to a wide IP network delivering individual media streams to thousands of users. The proposed monitoring solution is called QuEM (Qualitative Experience Monitoring). It is based on the detection of degradations in the network Quality of Service (packet losses, bandwidth drops, etc.) and the mapping of each degradation event to a qualitative description of its effect on the perceived Quality of Experience (audio mutes, video artifacts, etc.). This mapping is based on the analysis of the transport and Network Abstraction Layer information of the coded stream, and allows a good characterization of the most relevant defects observed in this kind of service: screen freezes, macroblocking, audio mutes, video quality drops, delay issues, and service outages. The results have been validated by subjective quality assessment tests. The methodology used for those tests was also designed to mimic as closely as possible the conditions of a real user of these services: the impairments to evaluate are introduced randomly in the middle of a continuous video stream.
Based on the monitoring solution, several applications have been proposed as well: an unequal error protection system which provides higher protection to the parts of the stream which are more critical for the QoE, a solution which applies the same principles to minimize the impact of incomplete segment downloads in HTTP Adaptive Streaming, and a selective scrambling algorithm which ciphers only the most sensitive parts of the media stream. A fast channel change application is also presented, as well as a discussion about how to apply the previous results and concepts in a 3D video scenario.
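To make the qualitative mapping idea concrete, here is a minimal Python sketch; the event fields, rule set and impairment labels are illustrative assumptions, not the thesis's actual QuEM implementation.

```python
# Hypothetical sketch of a QuEM-style mapping from network QoS degradations
# to qualitative QoE impairments. Names and rules are assumptions.
from dataclasses import dataclass

@dataclass
class QosEvent:
    kind: str          # "packet_loss" or "bandwidth_drop"
    stream: str        # "video" or "audio"
    nal_unit: str = "" # e.g. "IDR", "non-IDR"; empty for audio

def classify(event: QosEvent) -> str:
    """Map a network-level degradation to a qualitative QoE description."""
    if event.kind == "bandwidth_drop":
        return "video quality drop"
    if event.stream == "audio":
        return "audio mute"
    # Losing a random-access (IDR) picture propagates until the next one,
    # so it is treated as more severe than losing a predicted frame.
    if event.nal_unit == "IDR":
        return "screen freeze"
    return "macroblocking"

print(classify(QosEvent("packet_loss", "video", "IDR")))  # screen freeze
```

The point of the sketch is the design choice the abstract describes: the monitor never decodes pixels, it only inspects transport and NAL-layer metadata, which is what keeps the approach scalable to thousands of streams.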
Abstract:
The objective of this paper is to present a framework that can facilitate the university-level learning process in Project Management for students enrolled at different universities in different locations, each attending their own Project Management course but sharing a virtual experience of executing and managing projects. The framework includes both information systems and methodological procedures integrated into the information system, making it possible to assess learning performance.
Abstract:
In the risk analysis methodologies for information systems promoted by international standards, assets are interrelated: an attack on one asset can propagate through the network and threaten an organization's most valuable assets. It is therefore necessary to valuate all assets, the direct and indirect asset dependencies, the probability of threats, and the resulting asset degradation. These methodologies do not, however, consider uncertain valuations, and they use precise values on different scales, usually percentages. In the approach presented here, experts use linguistic terms to represent asset values, dependencies, and the frequency and asset degradation associated with possible threats. Computations are based on the trapezoidal fuzzy numbers associated with these linguistic terms.
Abstract:
We propose a fuzzy approach to risk analysis for information systems. We extend the MAGERIT methodology, which valuates asset dependencies, to a fuzzy framework by adding fuzzy linguistic terms to valuate the different elements in risk analysis (terminal asset values, asset dependencies, and the probability of threats and the resulting asset degradation). Computations are based on the trapezoidal fuzzy numbers associated with these linguistic terms, and finally the results of these operations are translated back into a linguistic term by means of a similarity function.
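The computation pipeline the two abstracts above describe can be sketched in a few lines of Python. The linguistic scale, the approximate product of trapezoidal numbers, and the vertex-distance similarity below are illustrative assumptions, not the papers' actual definitions.

```python
# Illustrative sketch (not the MAGERIT fuzzy extension itself): trapezoidal
# fuzzy numbers (a, b, c, d), fuzzy arithmetic, and a similarity function
# that maps a computed fuzzy value back to the closest linguistic term.
Trap = tuple  # (a, b, c, d) with a <= b <= c <= d, all on [0, 1]

TERMS = {  # assumed linguistic scale
    "very low":  (0.0, 0.0, 0.1, 0.2),
    "low":       (0.1, 0.2, 0.3, 0.4),
    "medium":    (0.3, 0.45, 0.55, 0.7),
    "high":      (0.6, 0.7, 0.8, 0.9),
    "very high": (0.8, 0.9, 1.0, 1.0),
}

def mult(x: Trap, y: Trap) -> Trap:
    # Approximate product of two positive trapezoidal fuzzy numbers.
    return tuple(xi * yi for xi, yi in zip(x, y))

def similarity(x: Trap, y: Trap) -> float:
    # 1 minus the mean absolute distance between corresponding vertices.
    return 1 - sum(abs(xi - yi) for xi, yi in zip(x, y)) / 4

def to_term(x: Trap) -> str:
    # Translate a computed fuzzy number back into the closest linguistic term.
    return max(TERMS, key=lambda t: similarity(x, TERMS[t]))

# Risk of one threat on one asset: frequency * degradation * asset value.
risk = mult(mult(TERMS["high"], TERMS["medium"]), TERMS["very high"])
print(to_term(risk))
```

Propagation along asset dependencies would repeat the same `mult` step over the dependency chain; the key property is that uncertainty is carried through the whole computation and only collapsed to a linguistic label at the end.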
Abstract:
Nitrate leaching (NL) is an important N loss process in irrigated agriculture that imposes a cost on the farmer and the environment. A meta-analysis of published experimental results from agricultural irrigated systems was conducted to identify those strategies that have proven effective at reducing NL and to quantify the scale of reduction that can be achieved. Forty-four scientific articles were identified which investigated four main strategies (water and fertilizer management, use of cover crops, and fertilizer technology), creating a database with 279 observations on NL and 166 on crop yield. Management practices that adjust water application to crop needs reduced NL by a mean of 80% without a reduction in crop yield. Improved fertilizer management reduced NL by 40%, and the best relationship between yield and NL was obtained when applying the recommended fertilizer rate. Replacing a fallow with a non-legume cover crop reduced NL by 50%, while using a legume did not have any effect on NL. Improved fertilizer technology also decreased NL but was the least effective of the selected strategies. The risk of nitrate leaching from irrigated systems is high, but optimum management practices may mitigate this risk and maintain crop yields while enhancing environmental sustainability.
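The percentage reductions reported above are typically derived from per-study effect sizes such as the log response ratio. The following sketch shows that standard computation with hypothetical numbers; it is not the article's actual dataset or code.

```python
# Hedged sketch of the effect-size arithmetic commonly used in such a
# meta-analysis; the example values are hypothetical, not the article's data.
import math

def log_response_ratio(treatment_mean: float, control_mean: float) -> float:
    """lnRR: negative values mean the practice reduced nitrate leaching."""
    return math.log(treatment_mean / control_mean)

def percent_change(lnrr: float) -> float:
    """Back-transform lnRR into a percentage change vs the control."""
    return (math.exp(lnrr) - 1) * 100

# e.g. 10 kg N/ha leached under adjusted irrigation vs 50 kg N/ha control
lnrr = log_response_ratio(10, 50)
print(round(percent_change(lnrr)))  # -80, i.e. an 80% reduction
```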
Abstract:
Application of nitrogen (N) fertilizers in agricultural soils increases the risk of N loss to the atmosphere in the form of ammonia (NH3), nitrous oxide (N2O) and nitric oxide (NO), and to water bodies as nitrate (NO3-). The implementation of agricultural management practices can affect these losses. In Mediterranean irrigation systems, the greatest losses of NO3- through leaching occur during the irrigation and intercrop periods. One way to abate these losses during the intercrop period is the use of cover crops that absorb part of the residual N from the root zone (Gabriel and Quemada, 2011). Moreover, during the following crop, these species could be applied as soil amendments, providing both C and N to the soil. Research on the effect of cover and catch crops in decreasing the pool of potentially lost N has focused primarily on NO3- leaching. The aim of this work was to evaluate the effect of cover crops on N2O emission during the intercrop period in a maize system, and of their subsequent incorporation into the soil in the following maize crop.
Abstract:
One of the main problems in urban areas is the steady growth in car ownership and traffic levels. The challenge of sustainability is therefore focused on shifting the demand for mobility from cars to collective means of transport, and buses are a key element of public transport systems. In this respect, Real Time Passenger Information (RTPI) systems help people change their travel behaviour towards more sustainable transport modes. This paper provides an assessment methodology which evaluates how RTPI systems improve the quality of bus service performance in two European cities, Madrid and Bremerhaven. In the case of Madrid, bus punctuality has increased by 3%. Regarding the travellers' perception, Madrid raised its quality of service by 6% while Bremerhaven increased it by 13%. In addition, the users' perception of the Public Transport (PT) image increased by 14%.
Abstract:
The Smartcity Málaga project is one of Europe's largest eco-efficient city initiatives. The project has implemented a field trial in 50 households to study the effects of energy monitoring and management technologies on residential electricity consumption. This poster presents some lessons learned on energy consumption trends, smart clamp reliability and the suitability of the power contracted by users, obtained after six months of data analysis.
Abstract:
Nanotechnology is a research area of recent development that deals with the manipulation and control of matter with dimensions ranging from 1 to 100 nanometers. At the nanoscale, materials exhibit singular physical, chemical and biological phenomena, very different from those manifested at the conventional scale. In medicine, nanosized compounds and nanostructured materials offer improved drug targeting and efficacy with respect to traditional formulations, and reveal novel diagnostic and therapeutic properties. Nevertheless, the complexity of information at the nano level is much higher than at the conventional biological levels (from populations down to the cell); thus, any nanomedical research workflow inherently demands advanced information management. Unfortunately, Biomedical Informatics (BMI) has not yet provided the necessary framework to deal with such information challenges, nor adapted its methods and tools to the new research field. In this context, the novel area of nanoinformatics aims to build new bridges between medicine, nanotechnology and informatics, allowing the application of computational methods to solve informational issues at the wide intersection between biomedicine and nanotechnology.
The above observations determine the context of this doctoral dissertation, which is focused on analyzing the nanomedical domain in-depth, and developing nanoinformatics strategies and tools to map across disciplines, data sources, computational resources, and information extraction and text mining techniques, for leveraging available nanomedical data. The author analyzes, through real-life case studies, some research tasks in nanomedicine that would require or could benefit from the use of nanoinformatics methods and tools, illustrating present drawbacks and limitations of BMI approaches to deal with data belonging to the nanomedical domain. Three different scenarios, comparing both the biomedical and nanomedical contexts, are discussed as examples of activities that researchers would perform while conducting their research: i) searching over the Web for data sources and computational resources supporting their research; ii) searching the literature for experimental results and publications related to their research, and iii) searching clinical trial registries for clinical results related to their research. The development of these activities will depend on the use of informatics tools and services, such as web browsers, databases of citations and abstracts indexing the biomedical literature, and web-based clinical trial registries, respectively. For each scenario, this document provides a detailed analysis of the potential information barriers that could hamper the successful development of the different research tasks in both fields (biomedicine and nanomedicine), emphasizing the existing challenges for nanomedical research —where the major barriers have been found. The author illustrates how the application of BMI methodologies to these scenarios can be proven successful in the biomedical domain, whilst these methodologies present severe limitations when applied to the nanomedical context. 
To address such limitations, the author proposes an original nanoinformatics approach specifically designed to deal with the special characteristics of information at the nano level. This approach consists of an in-depth analysis of the scientific literature and available clinical trial registries to extract relevant information about experiments and results in nanomedicine (textual patterns, common vocabulary, experiment descriptors, characterization parameters, etc.), followed by the development of mechanisms to automatically structure and analyze this information. This analysis resulted in the generation of a gold standard (a manually annotated training or reference set), which was applied to the automatic classification of clinical trial summaries, distinguishing studies focused on nanodrugs and nanodevices from those aimed at testing traditional pharmaceuticals. The present work aims to provide the necessary methods for organizing, curating and validating existing nanomedical data on a scale suitable for decision-making. Similar analyses for different nanomedical research tasks would help to detect which nanoinformatics resources are required to meet current goals in the field, as well as to generate densely populated and machine-interpretable reference datasets from the literature and other unstructured sources for further testing novel algorithms and inferring new valuable information for nanomedicine.
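The classification step described above can be illustrated with a deliberately minimal sketch: keyword-based flagging of trial summaries evaluated against a tiny hand-labeled gold standard. The term list and the example summaries are assumptions for illustration, not the dissertation's actual classifier or corpus.

```python
# Minimal illustrative sketch (not the dissertation's classifier):
# flag clinical trial summaries as nano-related via a term lexicon,
# then score against a tiny hypothetical gold standard.
NANO_TERMS = {"nanoparticle", "liposomal", "nanodevice",
              "quantum dot", "albumin-bound", "micelle"}

def is_nano(summary: str) -> bool:
    """True if the summary mentions any term from the nano lexicon."""
    text = summary.lower()
    return any(term in text for term in NANO_TERMS)

# Tiny hypothetical gold standard: (summary, manually assigned label)
gold = [
    ("Phase II trial of liposomal doxorubicin in breast cancer", True),
    ("Randomized trial of metformin in type 2 diabetes", False),
    ("Safety of a nanoparticle albumin-bound paclitaxel regimen", True),
]
accuracy = sum(is_nano(s) == label for s, label in gold) / len(gold)
print(accuracy)  # 1.0 on this toy set
```

A real pipeline would replace the fixed lexicon with patterns and vocabulary mined from the literature, which is precisely the extraction step the approach describes; the gold standard's role, however, is the same: a manually annotated set against which any such classifier is trained and tested.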
Abstract:
The paper describes some relevant results of ongoing research aiming to elaborate a methodology to support mobility management in natural parks, compatible with their protection mission: a procedure has been developed to reproduce the mobility-environment relationships under various operational conditions. The final purpose is the identification of: a) the effects of various choices in transport planning, at both the long-term and strategic levels; b) the most effective mobility management policies. The work is articulated in the following steps: 1) definition of the protected area on the basis of ecological and socio-economic criteria and legislative constraints; 2) analysis of mobility needs in the protected areas; 3) reconstruction of the state of the art of mobility management in natural parks at the European level; 4) analysis of the traffic flow measurement methods used; 5) analysis of the environmental impacts due to transport systems modelling (air pollution and noise only); 6) identification of mitigation measures to be potentially applied. The whole methodology has been tested and validated on Italian case studies: i) the concerned area has been zoned according to land-use peculiarities; ii) the local situations of transport infrastructure (roads and parking), services (public transport systems) and rules (traffic regulations) have been mapped with reference to physical and functional attributes; iii) the mobility, both systematic and touristic, has been represented in an origin-destination matrix. By means of an assignment model, the flows have been distributed and the corresponding average speeds calculated to quantify gaseous and noise emissions; the criticalities in the reference scenario have been highlighted, and some alternative scenarios, including both operational and infrastructural measures, have been identified.
The comparison between projects and reference scenario allowed the quantification of effects (variation of emissions) for each scenario and a selection of the most effective management actions to be taken.
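The final comparison step (flows and average speeds per link, emission factors, scenario deltas) can be sketched as follows. The speed-dependent emission factor curve and the link data are purely hypothetical placeholders, not values from the study.

```python
# Illustrative sketch of the emissions-comparison step only (not the full
# assignment model): emission factors and link flows are assumptions.
def co_emission_factor(speed_kmh: float) -> float:
    """Hypothetical g/veh-km curve: emissions rise at low average speeds."""
    return 20.0 if speed_kmh < 20 else 8.0 if speed_kmh < 50 else 12.0

def scenario_emissions(links) -> float:
    """links: iterable of (flow veh/h, length km, avg speed km/h)."""
    return sum(flow * length * co_emission_factor(speed)
               for flow, length, speed in links)

# Reference scenario vs an alternative where car traffic is partly diverted
# (e.g. a shuttle service), raising average speeds on the congested link.
reference = [(400, 2.0, 15), (250, 3.5, 45)]
with_shuttle = [(150, 2.0, 35), (250, 3.5, 45)]
delta = scenario_emissions(with_shuttle) - scenario_emissions(reference)
print(round(delta))  # -13600: an emission reduction vs the reference
```

This "variation of emissions" per scenario is exactly the quantity the methodology uses to rank the candidate management actions.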
Abstract:
The analysis of how tourists select their holiday destinations, along with the factors determining their choices, is very important for promoting tourism. In particular, transportation is supposed to have a great influence on tourists' decisions. The aim of this paper is to investigate the role of High Speed Rail (HSR) systems in destination choice. Two key tourist destinations in Europe, namely Paris and Madrid, have been chosen to identify the factors influencing this choice. On the basis of two surveys gathering information from tourists, it has been found that the presence of architectural sites, the promotion quality of the destination itself, and cultural and social events have an impact on destination choice. However, the availability of HSR systems affects the choice of Paris and Madrid as tourist destinations in different ways. For Paris, the TGV is considered a real transport mode alternative among tourists. On the other hand, Madrid is chosen by tourists irrespective of the presence of an efficient HSR network. Data collected from the two surveys have been used for a further quantitative analysis. Regression models have been specified and their parameters calibrated to identify the factors influencing holidaymakers to revisit Paris and Madrid and to visit other tourist places accessible by HSR from these capitals.
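A sketch of the kind of binary choice model that could be calibrated from such survey data is shown below. The variables and coefficient values are purely hypothetical assumptions for illustration, not the paper's calibrated model.

```python
# Hedged sketch of a logistic (binary choice) model of revisiting a
# destination; the coefficients below are hypothetical, not calibrated values.
import math

COEF = {"intercept": -0.4, "hsr_available": 0.9, "cultural_events": 0.6}

def revisit_probability(hsr_available: int, cultural_events: int) -> float:
    """Probability a surveyed tourist revisits, given 0/1 explanatory vars."""
    u = (COEF["intercept"]
         + COEF["hsr_available"] * hsr_available
         + COEF["cultural_events"] * cultural_events)
    return 1 / (1 + math.exp(-u))

# With these assumed coefficients, HSR availability raises the probability:
print(revisit_probability(1, 1) > revisit_probability(0, 1))  # True
```

Calibration would estimate `COEF` from the survey responses (e.g. by maximum likelihood); the sign and size of the HSR coefficient is what would differ between the Paris and Madrid samples.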
Abstract:
The final purpose is the identification of: a) the effects of various choices in transport planning, at both the long-term and strategic levels; b) the most effective mobility management policies. The preliminary work was articulated in the following steps: 1) definition of the protected area on the basis of ecological and socio-economic criteria and legislative constraints; 2) analysis of mobility needs in the protected areas; 3) reconstruction of the state of the art of mobility management in natural parks at the European level; 4) analysis of the traffic flow measurement methods used; 5) analysis of the environmental impacts due to transport systems modelling (limited to air pollution and noise); 6) identification of mitigation measures to be potentially applied. The whole methodology was first tested on the case study of the National Park of "Gran Sasso and Monti della Laga" and further validated on the National Park of "Gargano", both located in Italy: i) the concerned area has been zoned according to land-use peculiarities; ii) the local situations of transport infrastructure (roads and parking), services (public transport systems) and rules (traffic regulations) have been mapped with reference to physical and functional attributes; iii) the mobility, both systematic and touristic, has been synthetically represented in an origin-destination matrix. By means of an assignment model, the distribution of flows and the corresponding average speeds have been determined to quantify gaseous and noise emissions. On this basis, the environmental criticalities in the reference scenario have been highlighted, and some alternative scenarios including both operational and infrastructural measures have been identified. The comparison between the projects and the reference scenario allowed the quantification of the effects (variation of emissions) for each scenario and the selection of the most effective management actions to be taken.
Abstract:
This article has been extracted from the results of a thesis entitled "Potential bioelectricity production of the Madrid Community Agricultural Regions based on rye and triticale biomass." The aim was, first, to quantify the potential of rye (Secale cereale L.) and triticale (Triticosecale aestivum L.) biomass in each of the Madrid Community agricultural regions, and second, to locate the most suitable areas for the installation of power plants using biomass. At least 17,339.9 t d.m. of rye and triticale would be required to satisfy the biomass needs of a 2.2 MW power plant (considering an efficiency of 21.5%, 8,000 expected operating hours/year and a biomass LCP of 4,060 kcal/kg for both crops), and 2,577 ha would be used (which represents 2.79% of the Madrid Community fallow dry land surface). Biomass yields that could be achieved in the Madrid Community using 50% of the fallow dry land surface (46,150 ha, representing 5.75% of the Community area), based on rye and triticale crops, are estimated at 84,855, 74,906, 70,109, 50,791, 13,481, and 943 t annually for the Campiña, Vegas, Sur Occidental, Área Metropolitana, Lozoya-Somosierra, and Guadarrama regions. The latter figures represent a bioelectricity potential of 10.77, 9.5, 8.9, 6.44, 1.71, and 0.12 MW, respectively.
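The 17,339.9 t figure follows from a standard energy-balance calculation using the constants quoted in the abstract (21.5% efficiency, 8,000 h/year, 4,060 kcal/kg, 1 kWh = 860 kcal). The short check below reproduces it; the function name and structure are illustrative, not the thesis's code.

```python
# Worked check of the abstract's biomass requirement; constants are taken
# from the abstract itself, the formula is a standard energy balance.
def biomass_required(power_mw: float, efficiency: float,
                     hours: float, lhv_kcal_kg: float) -> float:
    """Tonnes of dry biomass per year needed to feed the plant."""
    energy_kwh = power_mw * 1000 * hours   # annual electricity output
    fuel_kwh = energy_kwh / efficiency     # primary energy in the fuel
    fuel_kcal = fuel_kwh * 860.0           # 1 kWh = 860 kcal
    return fuel_kcal / lhv_kcal_kg / 1000  # kg -> t

t = biomass_required(2.2, 0.215, 8000, 4060)
print(round(t, 1))  # 17339.9, matching the abstract's figure
```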