111 results for Abstractions


Relevance:

10.00%

Publisher:

Abstract:

Models are an effective tool for systems and software design. They allow software architects to abstract away non-relevant details. These qualities are also useful for the technical management of networks, systems and software, such as those that compose service-oriented architectures. Models can provide a set of well-defined abstractions over the distributed, heterogeneous service infrastructure that enable its automated management. We propose to use the managed system as a source of dynamically generated runtime models, and to decompose management processes into a composition of model transformations. We have created an autonomic service deployment and configuration architecture that obtains, analyzes, and transforms system models to apply the required actions, while remaining oblivious to the low-level details. An instrumentation layer automatically builds these models and translates the planned management actions back onto the system. We illustrate these concepts with a distributed service update operation.
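
To make the pipeline concrete, here is a minimal Python sketch of management decomposed into model transformations: an instrumentation layer builds a runtime model, a transformation produces the target model, and the difference between the two is interpreted as management actions. All names (RuntimeModel, instrument, plan_update, diff_actions) and the toy service map are hypothetical, not the paper's API.

```python
# Minimal sketch with hypothetical names: management as model transformations.
from dataclasses import dataclass, field

@dataclass
class RuntimeModel:
    """Abstract view of the deployed services and their versions."""
    services: dict = field(default_factory=dict)   # service name -> version

def instrument(system_state: dict) -> RuntimeModel:
    """Instrumentation layer: build a runtime model from low-level state."""
    return RuntimeModel(services=dict(system_state))

def plan_update(model: RuntimeModel, service: str, version: str) -> RuntimeModel:
    """Model transformation: produce the target model for a service update."""
    target = RuntimeModel(services=dict(model.services))
    target.services[service] = version
    return target

def diff_actions(current: RuntimeModel, target: RuntimeModel) -> list:
    """Interpret the model difference as concrete management actions."""
    return [("redeploy", name, ver) for name, ver in target.services.items()
            if current.services.get(name) != ver]

current = instrument({"billing": "1.0", "catalog": "2.1"})
target = plan_update(current, "billing", "1.1")
print(diff_actions(current, target))   # [('redeploy', 'billing', '1.1')]
```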

Relevance:

10.00%

Publisher:

Abstract:

In arid countries worldwide, social conflicts between irrigation-based human development and the conservation of aquatic ecosystems are widespread and attract many public debates. This research focuses on the analysis of water and agricultural policies aimed at conserving groundwater resources and maintaining rural livelihoods in a basin in Spain's central arid region. Intensive groundwater mining for irrigation has caused overexploitation of the basin's large aquifer, degraded reputed wetlands, and given rise to notable social conflicts over the years. To tackle the multifaceted socio-ecological interactions of complex water systems, the methodology used in this study consists of a novel integration, into a common platform, of an economic optimization model and the hydrology model WEAP (Water Evaluation And Planning system). This robust tool is used to analyze the spatial and temporal effects of different water and agricultural policies under different climate scenarios. It permits the prediction of climate and policy outcomes across farm types (water stress impacts and adaptation), at the basin level (aquifer recovery), and along the policies' implementation horizon (short and long run). Results show that the region's current quota-based water policies may help reduce water consumption on the farms, but will not recover the aquifer and will inflict income losses on the rural communities. This situation would worsen in the event of drought. Economies of scale and technology are evident, as larger farms with crop diversification and those equipped with modern irrigation will adapt better to water stress conditions. However, the long-term sustainability of the aquifer and the maintenance of rural livelihoods will be attained only if additional policy measures are put in place, such as the control of illegal abstractions and the establishment of a water bank. Within the policy domain, the research contributes to the new sustainable development strategy of the EU by concluding that, in water-scarce regions, effective integration of water and agricultural policies is essential for achieving the water protection objectives of EU policies. Therefore, the design and enforcement of well-balanced, region-specific policies is a major task faced by policy makers for achieving successful water management that will ensure nature protection and human development at tolerable social costs. From a methodological perspective, this research contributes to better addressing hydrological questions as well as economic and social issues in complex water and human systems. Its integrated vision provides a valuable illustration to inform water policy and management decisions within contexts of water-related conflicts worldwide.
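
The integration described above can be pictured as a feedback loop in which an economic model chooses water abstractions and a hydrological balance propagates them through the aquifer. The following Python sketch is purely illustrative: the profit curve, quota, recharge and storage figures are invented, and the real study couples a full optimization model with WEAP rather than these stub functions.

```python
# Toy coupling loop; all coefficients are invented, and the real study uses
# a full economic optimization model plus the WEAP hydrology model.

def optimal_water_use(quota: float) -> float:
    """Economic side: argmax of profit 120*w - 0.4*w**2, capped by the quota."""
    return min(150.0, quota)          # unconstrained optimum: 120 / (2 * 0.4)

def farm_profit(w: float) -> float:
    return 120.0 * w - 0.4 * w ** 2

def aquifer_balance(storage: float, recharge: float, abstraction: float) -> float:
    """Hydrology side: one-year mass balance of the aquifer."""
    return storage + recharge - abstraction

storage = 10_000.0
for year in range(5):                         # a short policy horizon
    use = optimal_water_use(quota=100.0)      # quota limits farm abstraction
    storage = aquifer_balance(storage, recharge=80.0, abstraction=use)
    print(year, farm_profit(use), round(storage, 1))
# The quota slows depletion, but storage still falls while use > recharge,
# mirroring the finding that quotas alone do not recover the aquifer.
```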

Relevance:

10.00%

Publisher:

Abstract:

Precise modeling of the program heap is fundamental for understanding the behavior of a program, and is thus of significant interest for many optimization applications. One of the fundamental properties of the heap that can be used in a range of optimization techniques is the sharing relationship between the elements in an array or collection. If an analysis can determine that the memory locations pointed to by different entries of an array (or collection) are disjoint, then in many cases loops that traverse the array can be vectorized or transformed into a thread-parallel version. This paper introduces several novel sharing properties over the concrete heap and corresponding abstractions to represent them. In conjunction with an existing shape analysis technique, these abstractions allow us to precisely resolve the sharing relations in a wide range of heap structures (arrays, collections, recursive data structures, composite heap structures) in a computationally efficient manner. The effectiveness of the approach is evaluated on a set of challenge problems from the JOlden and SPECjvm98 suites. Sharing information obtained from the analysis is used to achieve substantial thread-level parallel speedups.
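
The core observation, that disjoint reachable regions license parallel traversal, can be illustrated over a concrete heap as follows. This Python sketch is a dynamic stand-in for the paper's static analysis: reachable() and entries_disjoint() are hypothetical helpers, and the paper's abstractions answer the same question symbolically, at compile time.

```python
# Dynamic illustration only; the paper's analysis works statically over
# abstract heaps, not by walking concrete objects as done here.
from concurrent.futures import ThreadPoolExecutor

def reachable(obj, seen=None):
    """Collect ids of list objects reachable from obj (a concrete heap region)."""
    seen = set() if seen is None else seen
    if id(obj) in seen or not isinstance(obj, list):
        return seen
    seen.add(id(obj))
    for child in obj:
        reachable(child, seen)
    return seen

def entries_disjoint(array) -> bool:
    """True if the heap regions hanging off the array entries do not overlap."""
    regions = [reachable(entry) for entry in array]
    return all(regions[i].isdisjoint(regions[j])
               for i in range(len(regions)) for j in range(i + 1, len(regions)))

data = [[1, [2]], [3, [4]], [5]]           # three disjoint sub-heaps
if entries_disjoint(data):                 # safe to process entries in parallel
    with ThreadPoolExecutor() as pool:
        print(list(pool.map(len, data)))   # [2, 2, 1]
```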

Relevance:

10.00%

Publisher:

Abstract:

Several activities in service-oriented computing, such as automatic composition, monitoring, and adaptation, can benefit from knowing properties of a given service composition before executing it. Among these properties we focus on those related to execution cost and resource usage, in a wide sense, as they can be linked to QoS characteristics. In order to attain more accuracy, we formulate execution cost / resource usage as functions of the input data (or appropriate abstractions thereof) and show how these functions can be used to make better, more informed decisions when performing composition, adaptation, and proactive monitoring. We present an approach that, on the one hand, synthesizes these functions automatically from the definitions of the different orchestrations taking part in a system and, on the other hand, uses them effectively to reduce the overall costs of non-trivial service-based systems featuring sensitivity to data and the possibility of failure. We validate our approach by means of simulations of scenarios needing runtime selection of services and adaptation due to service failure. A number of rebinding strategies, including the use of cost functions, are compared.
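
A minimal sketch of how such data-dependent cost functions could drive rebinding: each candidate service carries a cost function over an abstraction of the input (here, its size) plus a failure probability, and the strategy selects the least expected cost. The candidate services, the coefficients, and RETRY_PENALTY are all hypothetical; the paper synthesizes the cost functions automatically rather than writing them by hand.

```python
# Hypothetical candidates and coefficients; illustrates the rebinding idea only.
candidates = {
    # service: (cost function over input size n, failure probability)
    "fast_unreliable": (lambda n: 0.5 * n, 0.20),
    "slow_reliable":   (lambda n: 0.9 * n, 0.01),
}

RETRY_PENALTY = 100.0   # assumed fixed cost of recovering from a failure

def expected_cost(cost_fn, p_fail, n):
    return cost_fn(n) + p_fail * RETRY_PENALTY

def rebind(n):
    """Choose the service with the least expected cost for input size n."""
    return min(candidates, key=lambda s: expected_cost(*candidates[s], n))

print(rebind(10))    # small input: retry penalty dominates -> 'slow_reliable'
print(rebind(500))   # large input: per-item cost dominates -> 'fast_unreliable'
```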

Relevance:

10.00%

Publisher:

Abstract:

Ciao is a public-domain, next-generation multi-paradigm programming environment with a unique set of features: Ciao offers a complete Prolog system, supporting ISO-Prolog, but its novel modular design allows both restricting and extending the language. As a result, it allows working with fully declarative subsets of Prolog and also extending these subsets (or ISO-Prolog) both syntactically and semantically. Most importantly, these restrictions and extensions can be activated separately on each program module, so that several extensions can coexist in the same application in different modules. Ciao also supports (through such extensions) programming with functions, higher-order (with predicate abstractions), constraints, and objects, as well as feature terms (records), persistence, several control rules (breadth-first search, iterative deepening, ...), concurrency (threads/engines), a good basis for distributed execution (agents), and parallel execution. Libraries also support WWW programming, sockets, external interfaces (C, Java, Tcl/Tk, relational databases, etc.), etc. Ciao offers support for programming in the large with a robust module/object system, module-based separate/incremental compilation (automatic, with no need for makefiles), an assertion language for declaring (optional) program properties (including types and modes, but also determinacy, non-failure, cost, etc.), automatic static inference and static/dynamic checking of such assertions, etc. Ciao also offers support for programming in the small, producing small executables (including only those builtins used by the program), and support for writing scripts in Prolog. The Ciao programming environment includes a classical top level and a rich Emacs interface with an embeddable source-level debugger and a number of execution visualization tools. The Ciao compiler (which can be run outside the top-level shell) generates several forms of architecture-independent and stand-alone executables, whose speed, efficiency and executable size are very competitive with those of other commercial and academic Prolog/CLP systems. Library modules can be compiled into compact bytecode or C source files, and linked statically, dynamically, or autoloaded. The novel modular design of Ciao enables, in addition to modular program development, effective global program analysis and static debugging and optimization via source-to-source program transformation. These tasks are performed by the Ciao preprocessor (ciaopp, distributed separately). The Ciao programming environment also includes lpdoc, an automatic documentation generator for LP/CLP programs. It processes Prolog files adorned with (Ciao) assertions and machine-readable comments and generates manuals in many formats, including PostScript, PDF, texinfo, info, HTML and man, as well as on-line help, ASCII README files, and entries for indices of manuals (info, WWW, ...), and it maintains WWW distribution sites.

Relevance:

10.00%

Publisher:

Abstract:

Provenance plays a major role in understanding and reusing the methods applied in a scientific experiment, as it provides a record of inputs, the processes carried out, and the use and generation of intermediate and final results. In the specific case of in-silico scientific experiments, a large variety of scientific workflow systems (e.g., Wings, Taverna, Galaxy, Vistrails) have been created to support scientists. All of these systems produce some sort of provenance about the executions of the workflows that encode scientific experiments. However, provenance is normally recorded at a very low level of detail, which complicates the understanding of what happened during execution. In this paper we propose an approach to automatically obtain abstractions from low-level provenance data by finding common workflow fragments in workflow execution provenance and relating them to templates. We have tested our approach on a dataset of workflows published by the Wings workflow system. Our results show that by using these kinds of abstractions we can highlight the most common abstract methods used in the executions of a repository, relating different runs and workflow templates with each other.
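
As an illustration of the fragment-finding step, the sketch below treats each run as a linear sequence of step types and counts contiguous fragments that recur across runs; recurring fragments are the candidate abstractions to relate back to templates. Real provenance traces are graphs, so this linear version, with hypothetical step names, is deliberately simplified.

```python
# Simplified, linear stand-in for fragment mining over provenance graphs.
from collections import Counter

def fragments(run, max_len=3):
    """All contiguous sub-sequences of a run, 2 to max_len steps long."""
    for i in range(len(run)):
        for j in range(i + 2, min(i + max_len, len(run)) + 1):
            yield tuple(run[i:j])

runs = [                                   # hypothetical step-type sequences
    ["fetch", "clean", "align", "plot"],
    ["fetch", "clean", "align", "stats"],
    ["clean", "align", "stats"],
]

counts = Counter(f for run in runs for f in set(fragments(run)))
common = [(f, c) for f, c in counts.items() if c >= 2]
print(sorted(common, key=lambda x: -x[1]))  # ('clean','align') occurs in all 3
```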

Relevance:

10.00%

Publisher:

Abstract:

While workflow technology has gained momentum in the last decade as a means for specifying and enacting computational experiments in modern science, reusing and repurposing existing workflows to build new scientific experiments is still a daunting task. This is partly due to the difficulty that scientists experience when attempting to understand existing workflows, which contain several data preparation and adaptation steps in addition to the scientifically significant analysis steps. One way to tackle the understandability problem is to provide abstractions that give a high-level view of the activities undertaken within workflows. As a first step towards such abstractions, we report in this paper on the results of a manual analysis performed over a set of real-world scientific workflows from the Taverna and Wings systems. Our analysis has resulted in a set of scientific workflow motifs that outline (i) the kinds of data-intensive activities that are observed in workflows (data-oriented motifs), and (ii) the different manners in which activities are implemented within workflows (workflow-oriented motifs). These motifs can be useful for informing workflow designers about good and bad practices for workflow development, for informing the design of automated tools for the generation of workflow abstractions, etc.

Relevance:

10.00%

Publisher:

Abstract:

Workflow technology continues to play an important role as a means for specifying and enacting computational experiments in modern science. Reusing and re-purposing workflows allows scientists to run new experiments faster, since the workflows capture useful expertise from others. As workflow libraries grow, scientists face the challenge of finding workflows appropriate for their task, understanding what each workflow does, and reusing relevant portions of a given workflow. We believe that workflows would be easier to understand and reuse if high-level views (abstractions) of their activities were available in workflow libraries. As a first step towards obtaining these abstractions, we report in this paper on the results of a manual analysis performed over a set of real-world scientific workflows from Taverna, Wings, Galaxy and Vistrails. Our analysis has resulted in a set of scientific workflow motifs that outline (i) the kinds of data-intensive activities that are observed in workflows (Data-Operation motifs), and (ii) the different manners in which activities are implemented within workflows (Workflow-Oriented motifs). These motifs are helpful to identify the functionality of the steps in a given workflow, to develop best practices for workflow design, and to develop approaches for automated generation of workflow abstractions.
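
The two motif families can be pictured as a small annotation vocabulary over workflow steps. In the Python sketch below only the two family names come from the abstract; the step names and per-step motif labels are invented placeholders rather than the catalogue identified in the paper.

```python
# Only the two family names come from the abstract; steps and labels invented.
from enum import Enum

class MotifFamily(Enum):
    DATA_OPERATION = "Data-Operation motif"        # what a step does to data
    WORKFLOW_ORIENTED = "Workflow-Oriented motif"  # how the step is realized

annotations = {
    "download_sequences": (MotifFamily.DATA_OPERATION, "data retrieval"),
    "format_conversion":  (MotifFamily.DATA_OPERATION, "data preparation"),
    "run_analysis_tool":  (MotifFamily.DATA_OPERATION, "data analysis"),
    "wrapped_cli_tool":   (MotifFamily.WORKFLOW_ORIENTED, "tool wrapping"),
}

for step, (family, motif) in annotations.items():
    print(f"{step}: {family.value} / {motif}")
```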

Relevance:

10.00%

Publisher:

Abstract:

Testing is nowadays the most widely used technique to validate software and assess its quality. It is integrated into all practical software development methodologies and plays a crucial role in the success of any software project. From the smallest units of code to the most complex components, their integration into a software system and later deployment: all pieces of a software product must be tested thoroughly before the product can be released. The main limitation of software testing is that it remains a mostly manual task, representing a large fraction of the total development cost. In this scenario, test automation is paramount to alleviate such high costs. Test case generation (TCG) is the process of automatically generating test inputs that achieve high coverage of the system under test. Among a wide variety of approaches to TCG, this thesis focuses on structural (white-box) TCG, where one of the most successful enabling techniques is symbolic execution. In symbolic execution, the program under test is executed with its input arguments being symbolic expressions rather than concrete values. This thesis relies on a previously developed constraint-based TCG framework for imperative object-oriented programs (e.g., Java), in which the imperative program under test is first translated into an equivalent constraint logic program, and then such a translated program is symbolically executed by relying on the standard evaluation mechanisms of Constraint Logic Programming (CLP), extended with special treatment for dynamically allocated data structures. Improving the scalability and efficiency of symbolic execution constitutes a major challenge. It is well known that symbolic execution quickly becomes impractical due to the large number of paths that must be explored and the size of the constraints that must be handled. Moreover, symbolic-execution-based TCG tends to produce an unnecessarily large number of test cases when applied to medium or large programs. The contributions of this dissertation can be summarized as follows. (1) A compositional approach to CLP-based TCG is developed which overcomes the inter-procedural path explosion by separately analyzing each component (method) in a program under test, storing the results as method summaries and incrementally reusing them to obtain whole-program results. A similar compositional strategy that relies on program specialization (partial evaluation) is also developed for the state-of-the-art symbolic execution tool Symbolic PathFinder (SPF). (2) Resource-driven TCG is proposed as a methodology to use resource consumption information to drive symbolic execution towards those parts of the program under test that comply with a user-provided resource policy, avoiding the exploration of the parts of the program that violate such a policy. (3) A generic methodology to guide symbolic execution towards the most interesting parts of a program is proposed, which uses abstractions as oracles to steer symbolic execution through the parts of the program under test that interest the programmer/tester most. (4) A new heap-constraint solver is proposed, which efficiently handles heap-related constraints and aliasing of references during symbolic execution and greatly outperforms the state-of-the-art standard technique known as lazy initialization. (5) All the techniques above have been implemented in the PET system (and some of them in the SPF tool). Experimental evaluation has confirmed that they considerably help towards a more scalable and efficient symbolic execution and TCG.
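
As a toy illustration of symbolic-execution-based TCG, the sketch below forks a path condition at each branch of a tiny program (abs) and asks a brute-force "solver" for a witness per feasible path; each witness becomes one generated test case. The solver and the program are hypothetical stand-ins for the CLP machinery the thesis builds on.

```python
# Toy symbolic executor; solve() is a brute-force stand-in for a constraint
# solver, and the two-path program models abs(x). Not the thesis' CLP system.
import itertools

def solve(path_condition, names, domain=range(-5, 6)):
    """Return one assignment satisfying every constraint, or None."""
    for values in itertools.product(domain, repeat=len(names)):
        env = dict(zip(names, values))
        if all(c(env) for c in path_condition):
            return env
    return None

def symbolic_abs(pc):
    """Symbolically 'execute' abs(x): fork the path condition at the branch."""
    yield pc + [lambda e: e["x"] >= 0], "returns x"
    yield pc + [lambda e: e["x"] < 0], "returns -x"

tests = []
for path, outcome in symbolic_abs(pc=[]):
    witness = solve(path, ["x"])
    if witness is not None:             # feasible path: emit one test case
        tests.append((witness, outcome))
print(tests)   # [({'x': 0}, 'returns x'), ({'x': -5}, 'returns -x')]
```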

Relevance:

10.00%

Publisher:

Abstract:

The economic evaluation of drought impacts is essential in order to define efficient and sustainable management and mitigation strategies. The aim of this study is to evaluate the economic impacts of a drought event on the agricultural sector and to measure how they are transmitted from primary production to industrial output and related employment. We fit econometric models to determine the magnitude of the economic loss attributable to water storage. The direct impacts of drought on agricultural productivity are measured through a direct attribution model. Indirect impacts on agricultural employment and the agri-food industry are evaluated through a nested indirect attribution model. The transmission of water scarcity effects from agricultural production to macroeconomic variables is measured through chained elasticities. The models allow for differentiating the impacts deriving from water scarcity from other sources of economic losses. Results show that drought impacts are less relevant at the macroeconomic level, but more significant for those activities directly dependent on water abstractions and precipitation. From a management perspective, the implications of these findings are important for developing effective mitigation strategies that reduce drought risk exposure.
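
The chained-elasticity mechanism reduces to a few lines of arithmetic: the effect of a water shock on an indirect variable is the product of the elasticities along the chain. All the elasticity values below are invented for illustration; they are not the study's estimates.

```python
# Invented elasticities; illustrates the chaining mechanism only.
e_ag_water = 0.30   # % change in agricultural output per 1% change in water
e_food_ag = 0.50    # % change in agri-food output per 1% change in ag output
e_emp_ag = 0.40     # % change in ag employment per 1% change in ag output

water_shock = -20.0                    # a 20% reduction in water availability
ag_output = e_ag_water * water_shock   # direct impact: -6%
agrifood = e_food_ag * ag_output       # indirect impact: -3%
employment = e_emp_ag * ag_output      # indirect impact: -2.4%

print(f"ag output {ag_output:+.1f}%, agri-food {agrifood:+.1f}%, "
      f"employment {employment:+.1f}%")  # impacts attenuate along the chain
```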

Relevance:

10.00%

Publisher:

Abstract:

Empirical Software Engineering (ESE) uses empirical studies as a means to generate evidence that helps determine under what circumstances it is better to use one software technology rather than another. This Master's Thesis is part of a research effort that explores whether the intuitions and/or preferences of the people who perform software testing can predict the effectiveness of three code evaluation techniques: reading by stepwise abstractions, decision coverage, and equivalence partitioning. To achieve this goal, the thesis analyzes the data collected in an empirical study run by its supervisors. In the empirical study, different subjects apply the three code evaluation techniques to three different programs, into which a series of faults had been artificially introduced. Subjects are required to report the defects found in the programs, as well as to answer a series of questions about their intuitions and preferences. The data analyses test: 1) what the intuitions and preferences of the subjects are (using Pearson's chi-square test); 2) whether subjects change their minds after applying the techniques (using the Kappa coefficient, the McNemar-Bowker test, and the Stuart-Maxwell test); 3) the consistency of the different questions, comparing intuitions with intuitions, preferences with preferences, and intuitions with preferences (using the Kappa coefficient); 4) finally, whether intuitions and/or preferences predict the actual effectiveness obtained (using the General Linear Model with repeated measures). The results show that there is no clear intuition or particular preference with respect to the programs. Moreover, although there are changes of mind after applying the techniques, there is no clear evidence to claim that intuition and preferences influence their effectiveness. Finally, there are relationships between intuitions and intuitions, preferences and preferences, and intuitions and preferences; these relationships are more noticeable after applying the techniques.
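
For readers unfamiliar with the tests involved, here is a minimal Python sketch of two of the analyses named above, Pearson's chi-square on a preference table and Cohen's kappa for before/after agreement, using scipy and scikit-learn. The contingency table and answer vectors are fabricated examples, not the study's data; the McNemar-Bowker and Stuart-Maxwell tests and the repeated-measures GLM are omitted for brevity.

```python
# Fabricated example data; shows the statistical machinery, not the results.
from scipy.stats import chi2_contingency
from sklearn.metrics import cohen_kappa_score

# Preference counts (rows: two programs; columns: the three techniques).
table = [[12, 8, 10],
         [9, 11, 10]]
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2={chi2:.2f}, dof={dof}, p={p:.3f}")  # large p: no clear preference

# The same subjects' preferred technique before and after applying them.
before = ["abstractions", "coverage", "partition", "coverage", "abstractions"]
after  = ["abstractions", "partition", "partition", "coverage", "coverage"]
print(f"kappa={cohen_kappa_score(before, after):.2f}")  # agreement beyond chance
```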

Relevance:

10.00%

Publisher:

Abstract:

We present a theoretical framework and a case study for reusing the same conceptual and computational methodology for both temporal abstraction and linear (unidimensional) space abstraction, in a domain (evaluation of traffic-control actions) significantly different from the one (clinical medicine) in which the method was originally used. The method, known as knowledge-based temporal abstraction, abstracts high-level concepts and patterns from time-stamped raw data using a formal theory of domain-specific temporal-abstraction knowledge. We applied this method, originally used to interpret time-oriented clinical data, to the domain of traffic control, in which the monitoring task requires linear pattern matching along both space and time. First, we reused the method for creation of unidimensional spatial abstractions over highways, given sensor measurements along each highway measured at the same time point. Second, we reused the method to create temporal abstractions of the traffic behavior, for the same space segments, but during consecutive time points. We defined the corresponding temporal-abstraction and spatial-abstraction domain-specific knowledge. Our results suggest that (1) the knowledge-based temporal-abstraction method is reusable over time and unidimensional space as well as over significantly different domains; (2) the method can be generalized into a knowledge-based linear-abstraction method, which solves tasks requiring abstraction of data along any linear distance measure; and (3) a spatiotemporal-abstraction method can be assembled from two copies of the generalized method and a spatial-decomposition mechanism, and is applicable to tasks requiring abstraction of time-oriented data into meaningful spatiotemporal patterns over a linear, decomposable space, such as traffic over a set of highways.
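
The shared abstraction step, collapsing raw values indexed by a linear coordinate (time points or highway positions) into maximal intervals with the same qualitative state, can be sketched as follows. The speed thresholds and state names are illustrative placeholders, not the system's actual domain knowledge base.

```python
# Illustrative thresholds and states; the real method is knowledge-based.
def classify(speed_kmh: float) -> str:
    if speed_kmh < 30:
        return "congested"
    if speed_kmh < 70:
        return "slow"
    return "free-flow"

def abstract_intervals(samples):
    """Collapse (coordinate, value) samples into maximal same-state intervals."""
    intervals = []
    for coord, value in samples:
        state = classify(value)
        if intervals and intervals[-1][2] == state:
            intervals[-1][1] = coord            # extend the current interval
        else:
            intervals.append([coord, coord, state])
    return intervals

# Sensor readings along one highway at a single time point (km marker, km/h);
# swapping the coordinate for timestamps gives the temporal variant.
readings = [(0, 95), (1, 90), (2, 40), (3, 25), (4, 20), (5, 80)]
print(abstract_intervals(readings))
# [[0, 1, 'free-flow'], [2, 2, 'slow'], [3, 4, 'congested'], [5, 5, 'free-flow']]
```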

Relevance:

10.00%

Publisher:

Abstract:

Achieving more efficient and equitable water management at the catchment scale does not rely only on the water resource itself, but also on other policies and scientific knowledge. There is a growing consensus that, in addition to the consideration of changing climate conditions, integration with research areas such as agronomy, land-use planning, economics and political science is required to meet societal and environmental water demands sustainably. The Common Agricultural Policy (CAP) is a main driver of trends in rural landscapes and agricultural systems, but environmental deterioration is now a principal concern. One of the most relevant changes has occurred with the expansion and intensification of olive orchards in Spain, taking place mainly through new irrigated areas or through the conversion from rainfed to irrigated systems. Moreover, changing climate conditions might exert a major role in water yield trends, but the role that ongoing land use and land cover changes (LULCC) might have on observed river flow trends remains unclear. This thesis aims to improve the understanding of the effects of agricultural production, policies and LULCC on water quality conditions, hydrological response and human water appropriation. Firstly, the study determines the existing trends for nitrates and suspended solids in the Guadalquivir river basin's surface waters (southern Spain) during the period from 1998 to 2009. From a policy perspective, the research uses panel data analysis to assess the main drivers, including the 2003 CAP reform, that are influencing both water quality indicators. Secondly, the water appropriation and nitrate pollution level originating from the production of olive oil in Spain are determined with a water footprint (WF) assessment, considering spatial and temporal variability across the Spanish provinces from 1997 to 2008. Finally, the thesis analyzes, through ecohydrological modeling, the effects of LULCC on the negative streamflow trends observed over the period 1973-2008 in the Upper Turia basin, headwaters of the Júcar river demarcation (eastern Spain). In the Guadalquivir river basin about 20% of the monitoring stations show significant trends, linear or quadratic, for each water quality indicator. Most significant nitrate trends are increasing rather than decreasing, and most significant quadratic terms of both indicators exhibit U-shaped patterns. The panel data models show that the most important drivers worsening nitrates and suspended solids in the basin are biomass intensification and the exports of both water quality indicators from upland regions. In regions where agricultural abandonment and/or de-intensification have taken place, water quality conditions have improved. For nitrates, the decoupling of agricultural subsidies and the reduction of the amount of subsidies to irrigated land underlie the observed reduction in nitrate concentration. Measures of irrigation modernization and the establishment of nitrate vulnerable zones ameliorate the concentration of nitrates in subbasins showing an increasing trend. However, the effects of the nitrate load from upland areas, the intensification of biomass and crop prices carry greater weight, leading to the final increasing trend in this group of subbasins, where annual crops dominate. For suspended solids, there is no clear evidence that the decoupling process has had a negative or positive influence. Nevertheless, greater amounts of subsidies still linked to production, particularly in irrigated regions, lead to increasing erosion rates. Although agricultural production has increased in the basin and water efficiency in the agricultural sector has improved, the issue of high erosion rates has not yet been properly addressed. The water footprint (WF) assessment reveals that, for 1 L of Spanish olive oil, more than 99.5% of the WF is related to the olive fruit production, whereas less than 0.5% is due to other components, i.e., the bottle, cap and label. Over the studied period, the green WF in rainfed and irrigated systems represents about 72% and 12%, respectively, of the total WF; the blue and grey WFs represent 6% and 10%, respectively. Olive production is concentrated in the regions with the smallest WF per unit of product. Olive oil production increased its apparent water productivity from 1997 to 2008, incentivized by growing trade prices, but so did the amount of virtual water exports. In fact, the largest producing areas present high water use efficiency and apparent water productivity as well as less nitrate pollution potential, but this enhances the pressure on the available water resources. Increasing groundwater abstractions related to olive oil exports may add further pressure to the already stressed Guadalquivir basin, showing the need to balance market forces with the available local resources. Concerning the effects of LULCC on the Upper Turia basin's streamflow, LULCC play a significant role in the water balance but are not the main driver underpinning the observed reduction in the Turia's streamflow. Increasing mean temperature is the main factor behind larger evapotranspiration rates and streamflow reduction. In fact, LULCC and climate change have had offsetting effects on streamflow generation during the study period. While streamflow has been negatively affected by increasing temperature, ongoing LULCC have positively compensated through reduced evapotranspiration rates, thanks mainly to shrubland clearing and forest degradation processes. The research provides insight for strengthening the interdisciplinarity between hydrological and spatial planning, highlighting the need to include the implications of LULCC in future hydrological plans. These findings are valuable for the management of the Turia river basin, and the approach employed is useful for determining the weight of LULCC in the hydrological response of other regions.
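
The water-footprint accounting reported above reduces to a simple component sum. In the sketch below, only the component shares (72% green rainfed, 12% green irrigated, 6% blue, 10% grey, and under 0.5% for packaging) come from the abstract; the absolute total per litre of oil is an assumed placeholder, not the thesis' figure.

```python
# Only the shares come from the text; the absolute total is a placeholder.
TOTAL_WF = 12_000.0   # hypothetical litres of water per litre of olive oil

shares = {
    "green (rainfed)":   0.72,
    "green (irrigated)": 0.12,
    "blue":              0.06,
    "grey":              0.10,
}

for component, share in shares.items():
    print(f"{component:18s} {share * TOTAL_WF:8.0f} L per L of oil")
print(f"{'packaging':18s} <{0.005 * TOTAL_WF:7.0f} L (bottle, cap, label)")
```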

Relevance:

10.00%

Publisher:

Abstract:

This thesis studies full reduction in lambda calculi. In a nutshell, full reduction consists in evaluating the bodies of functions in a functional programming language with binders. The classical (i.e., pure untyped) lambda calculus is taken as the formal system that models the functional paradigm. Full reduction is a prominent technique when programs are treated as data objects, for instance when performing optimizations by partial evaluation, or when some attribute of the program is represented by a program itself, like the type in modern proof assistants. A notable feature of many full-reducing operational semantics is their hybrid nature, which is formally introduced here and which constitutes the guiding theme of the thesis. In the lambda calculus, the hybrid nature amounts to a 'phase distinction' in the treatment of abstractions when considered either from outside or from inside themselves. This distinction entails a layered structure in which a hybrid semantics depends on one or more subsidiary semantics. From a programming-languages standpoint, the thesis shows how to derive implementations of full-reducing operational semantics from their specifications, by using program transformation techniques. The program transformation techniques are syntactical transformations which preserve the semantic equivalence of programs. The existing program transformation techniques are adjusted to work with implementations of hybrid semantics. The thesis also shows how full reduction impacts implementations that use the environment technique. The environment technique is a key ingredient of real-world implementations of abstract machines which helps to circumvent the issue with binders. From a formal-systems standpoint, the thesis discloses a novel consistent theory for the call-by-value variant of the lambda calculus which accounts for full reduction. This novel theory entails a notion of observational equivalence which distinguishes more points than other existing theories for the call-by-value lambda calculus. This contribution helps to establish a 'standard theory' in that calculus, analogous to the 'standard theory' advocated by Barendregt for the classical lambda calculus. Some proof-theoretical results are presented, and insights into the model-theoretical study are given.
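
The 'phase distinction' can be made concrete with a small full-reducing normalizer: a hybrid semantics (normalize) that reduces under binders, layered over a subsidiary weak semantics (whnf) that never does. This Python sketch is a generic normal-order normalizer, not one of the thesis' derived implementations, and its capture-avoiding substitution is kept deliberately naive.

```python
# Generic normal-order normalizer, illustrating the hybrid/subsidiary layering.
# Terms are tuples: ('var', x), ('lam', x, body), ('app', f, a).
import itertools

_fresh = (f"_v{i}" for i in itertools.count())

def subst(term, name, value):
    """Capture-avoiding substitution term[value/name] (naive binder renaming)."""
    kind = term[0]
    if kind == "var":
        return value if term[1] == name else term
    if kind == "lam":
        _, x, body = term
        if x == name:
            return term
        x2 = next(_fresh)          # always rename the binder to avoid capture
        return ("lam", x2, subst(subst(body, x, ("var", x2)), name, value))
    return ("app", subst(term[1], name, value), subst(term[2], name, value))

def whnf(term):
    """Subsidiary semantics: weak head normal form, no reduction under lambda."""
    if term[0] == "app":
        f = whnf(term[1])
        if f[0] == "lam":
            return whnf(subst(f[2], f[1], term[2]))
        return ("app", f, term[2])
    return term

def normalize(term):
    """Hybrid full-reducing semantics, layered on the subsidiary whnf."""
    t = whnf(term)
    if t[0] == "lam":                                   # the 'inside' phase
        return ("lam", t[1], normalize(t[2]))
    if t[0] == "app":
        return ("app", normalize(t[1]), normalize(t[2]))
    return t

# (\x. (\y. y) x) normalizes to \x. x: the redex under the binder is reduced.
term = ("lam", "x", ("app", ("lam", "y", ("var", "y")), ("var", "x")))
print(normalize(term))   # ('lam', 'x', ('var', 'x'))
```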

Relevance:

10.00%

Publisher:

Abstract:

The software engineering community has paid little attention to non-functional requirements, or quality attributes, compared with the studies performed on the capture, analysis and validation of functional requirements. This situation is even more acute in the case of distributed applications. In these applications we have to take into account, besides quality attributes such as correctness, robustness, extendibility, reusability, compatibility, efficiency, portability and ease of use, others like reliability, scalability, transparency, security, interoperability, concurrency, etc. In this work we show how these latter attributes are related to the different abstractions that coexist in the problem domain. To achieve this goal, we have established a taxonomy of quality attributes of distributed applications and have determined the set of services necessary to support such attributes.
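
The taxonomy-to-services mapping can be pictured as a simple table from attributes to supporting services. In the sketch below, the attribute names are taken from the abstract, while the service names are plausible placeholders rather than the paper's actual catalogue.

```python
# Attribute names from the abstract; service names are placeholders.
REQUIRED_SERVICES = {
    "reliability":      {"replication", "failure detection"},
    "scalability":      {"load balancing", "naming"},
    "transparency":     {"naming", "migration"},
    "security":         {"authentication", "encryption"},
    "interoperability": {"protocol adaptation", "naming"},
    "concurrency":      {"synchronization", "transactions"},
}

def services_for(attributes):
    """Union of the services needed to support a set of quality attributes."""
    return set().union(*(REQUIRED_SERVICES[a] for a in attributes))

print(sorted(services_for({"reliability", "security"})))
```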