994 results for compositional approach


Relevance: 100.00%

Abstract:

This study applies spatial statistical techniques, including cokriging, to integrate airborne geophysical (radiometric) data with ground-based measurements of peat depth and soil organic carbon (SOC), in order to monitor change in peat cover for carbon stock calculations. The research is part of the EU-funded Tellus Border project and is supported by the INTERREG IVA development programme of the European Regional Development Fund, which is managed by the Special EU Programmes Body (SEUPB). The premise is that saturated peat attenuates the radiometric signal from underlying soils and rocks. Contemporaneous ground-based measurements were collected to corroborate mapped estimates and to develop a statistical model for volumetric carbon content (VCC) to a depth of 0.5 metres. Field measurements included ground-penetrating radar, gamma-ray spectrometry and a soil sampling methodology that measured bulk density and soil moisture to determine VCC. One aim of the study was to explore whether airborne radiometric survey data can be used to establish VCC across a region. To account for the footprint of the airborne radiometric data, five cores were obtained at each soil sampling location: one at the centre of the ground radiometric equivalent sample location and one at each of the four corners, 20 metres apart. This soil sampling strategy replicated the methodology deployed for the Tellus Border geochemistry survey. Two key issues are discussed from this work: the first addresses the integration of the different sampling supports of the airborne and ground-measured data, and the second concerns the compositional nature of the VCC data.
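As a rough illustration of this kind of data integration, the sketch below uses scikit-learn's GaussianProcessRegressor as a simplified stand-in for cokriging, treating the airborne radiometric signal as a covariate alongside location. The variable names, attenuation coefficient and synthetic data are illustrative assumptions, not values from the study.

```python
# Simplified stand-in for cokriging: Gaussian-process regression of peat
# depth on location plus a co-located airborne radiometric covariate.
# All data here are synthetic and illustrative, not from the study.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
n = 200
coords = rng.uniform(0, 10_000, size=(n, 2))          # easting/northing (m)
true_depth = 2.0 + np.sin(coords[:, 0] / 2_000)       # hidden peat depth (m)
# Premise: saturated peat attenuates the radiometric signal (assumed
# exponential decay with depth, coefficient 0.8 per metre is invented).
radiometric = np.exp(-0.8 * true_depth) + rng.normal(0, 0.02, n)
depth_obs = true_depth + rng.normal(0, 0.1, n)        # ground-truth cores

X = np.column_stack([coords, radiometric])            # location + covariate
gp = GaussianProcessRegressor(
    kernel=RBF(length_scale=[1e3, 1e3, 0.1]) + WhiteKernel(1e-2),
    normalize_y=True,
)
gp.fit(X, depth_obs)
pred, sd = gp.predict(X, return_std=True)             # mapped estimate + error
print(f"mean abs error: {np.abs(pred - depth_obs).mean():.3f} m")
```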

Relevance: 100.00%

Abstract:

Isotopic data are becoming an important source of information on the sources, evolution and mixing processes of water in hydrogeologic systems. However, it is not clear how geochemical and isotopic data should be treated statistically together. We propose to introduce the isotopic information as new parts and to apply compositional data analysis to the resulting enlarged composition. The results are equivalent to downscaling the classical isotopic delta variables, because these are already relative (as required in the compositional framework) and isotopic variations are almost always very small. This methodology is illustrated and tested with a study of the Llobregat River Basin (Barcelona, NE Spain), where it is shown that, though very small, the isotopic variations complement the geochemical principal components and help to better identify pollution sources.
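A minimal sketch of the general idea, under stated assumptions (invented data; delta values converted to ratio-like parts via R = 1 + delta/1000, with the reference ratio set to 1): append the isotopic information as new parts, close the composition, and apply a centred log-ratio (clr) transform before a principal component analysis.

```python
# Minimal sketch, assumptions throughout: isotope ratios are treated as
# extra compositional "parts" alongside major-ion concentrations, the
# composition is closed, and a clr transform is applied before PCA.
import numpy as np

def closure(x):
    """Rescale each row so its parts sum to 1."""
    return x / x.sum(axis=1, keepdims=True)

def clr(x):
    """Centred log-ratio transform of a closed composition."""
    logx = np.log(x)
    return logx - logx.mean(axis=1, keepdims=True)

rng = np.random.default_rng(1)
ions = rng.uniform(1, 100, size=(50, 4))       # e.g. Ca, Mg, Na, Cl (mg/L)
# Illustrative conversion of delta notation (per mil) to a positive
# ratio-like part: R = 1 + delta/1000, reference ratio taken as 1.
delta_18O = rng.normal(-7, 0.5, size=(50, 1))  # small variations, as noted
iso_part = 1 + delta_18O / 1000

comp = closure(np.hstack([ions, iso_part]))
z = clr(comp)
_, s, _ = np.linalg.svd(z - z.mean(axis=0), full_matrices=False)
print(s[:3])                                   # leading singular values (PCA)
```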

Relevance: 100.00%

Abstract:

A system-compositional approach to constructing and studying models of the informational processes that take place in biological hierarchical neural networks is discussed. A computer toolbox has been developed for solving tasks in this area. A series of computational experiments applying the toolbox to an olfactory bulb model has been carried out, and well-known psychophysical phenomena were reproduced in the experiments.
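Purely as an illustration of hierarchical, component-wise model composition (no specific toolbox or model from the paper is implied), a toy two-layer rate model might look like this:

```python
# Illustrative sketch only: neuron populations are components composed
# into a hierarchy (loosely olfactory-bulb-like), each layer's output
# driving the next. Layer sizes and dynamics are invented.
import numpy as np

class Population:
    def __init__(self, n):
        self.rate = np.zeros(n)

    def step(self, drive):
        self.rate = np.tanh(drive)        # simple rate-model update
        return self.rate

class Hierarchy:
    """Compose populations; each layer's output drives the next."""
    def __init__(self, layers):
        self.layers = layers

    def step(self, stimulus):
        signal = stimulus
        for layer in self.layers:
            signal = layer.step(signal)
        return signal

glomeruli, mitral = Population(10), Population(10)
bulb = Hierarchy([glomeruli, mitral])
print(bulb.step(np.random.default_rng(3).uniform(0, 1, 10)).round(2))
```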

Relevance: 70.00%

Abstract:

This thesis presents a novel approach to building large-scale agent-based models of networked physical systems, using a compositional approach to provide extensibility and flexibility in building the models and simulations. A software framework, MODAM (MODular Agent-based Model), was implemented for this purpose and validated through simulations. These simulations allow the impact of technological change on the electricity distribution network to be assessed by examining the trajectories of electricity consumption at key locations over many years.
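The MODAM framework itself is not shown here; the following toy sketch only illustrates the compositional idea, with agents assembled from independently written behaviour modules and stepped over simulated years (all names, years and load figures are invented):

```python
# Illustrative sketch only (not the MODAM API): agents are composed from
# independent behaviour modules and stepped over simulated years, with
# the aggregate load at a key location recorded each year.
from dataclasses import dataclass, field

@dataclass
class Agent:
    name: str
    behaviours: list = field(default_factory=list)  # composable modules
    demand_kw: float = 0.0

    def step(self, year: int):
        for b in self.behaviours:
            b(self, year)

def baseline_load(agent, year):
    agent.demand_kw = 1.5                  # flat household baseline (assumed)

def pv_adoption(agent, year):
    if year >= 2015:                       # hypothetical adoption year
        agent.demand_kw -= 0.5             # rooftop PV offsets demand

households = [Agent(f"house{i}", [baseline_load, pv_adoption]) for i in range(3)]
for year in range(2010, 2021):
    for h in households:
        h.step(year)
    feeder_load = sum(h.demand_kw for h in households)  # key-location trajectory
    print(year, feeder_load)
```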

Relevance: 60.00%

Abstract:

Gesture in performance is widely acknowledged in the literature as an important element in making a performance expressive and meaningful, and the body has been shown to play an important role in the production and perception of vocal performance in particular. This paper is concerned with the role of gesture in creative works that seek to extend vocal performance via technology. A creative work for vocal performer, laptop computer and a human-computer interface called the eMic (Extended Microphone Stand Interface Controller) is presented as a case study to explore the relationships between movement, voice production and musical expression. The eMic is an interface for live vocal performance that allows a singer's gestures and interactions with a sensor-based microphone stand to be captured and mapped to musical parameters. The creative work discussed in this paper presents a new compositional approach for the eMic: movement is taken as the starting point for the composition, so that choreographed gesture becomes the basis for musical structures. By foregrounding the body and movement in the creative process, the aim is to create a more visually engaging performance in which the performer can use the body more effectively to express their musical objectives.
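As a hypothetical illustration of the sensor-to-parameter mapping described above (not the eMic's actual implementation), a tilt reading from the stand might be rescaled to a 0-127 MIDI controller value like this:

```python
# Hypothetical mapping sketch (not the eMic's real firmware or mappings):
# rescale a microphone-stand tilt reading to a MIDI CC value.
def map_range(x, in_lo, in_hi, out_lo, out_hi):
    """Linearly rescale x from [in_lo, in_hi] to [out_lo, out_hi]."""
    x = min(max(x, in_lo), in_hi)                  # clamp to input range
    return out_lo + (x - in_lo) * (out_hi - out_lo) / (in_hi - in_lo)

tilt_deg = 23.0                                    # example sensor reading
cc_value = round(map_range(tilt_deg, -45, 45, 0, 127))
print(cc_value)                                    # 0-127 controller value
```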

Relevance: 60.00%

Abstract:

Single-component geochemical maps are the most basic representation of spatial elemental distributions and are commonly used in environmental and exploration geochemistry. However, the compositional nature of geochemical data imposes several limitations on how the data should be presented. The problems relate to the constant-sum constraint (closure) and to the inherently multivariate, relative information conveyed by compositional data. Well known, for instance, is the tendency of all heavy metals to show lower values in soils with significant contributions of diluting elements (the quartz dilution effect), or the contrary effect, apparent enrichment in many elements due to removal of potassium during weathering. The validity of classical single-component maps is therefore investigated, and reasonable alternatives that honour the compositional character of geochemical concentrations are presented. The first recommended method relies on knowledge-driven log-ratios, chosen to highlight certain geochemical relations or to filter known artefacts (e.g. dilution with SiO2 or volatiles); this is similar to the classical approach of normalising to a single element. The second approach uses so-called log-contrasts, which employ suitable statistical methods (such as classification techniques, regression analysis, principal component analysis or clustering of variables) to extract potentially interesting geochemical summaries. The caution from this work is that unless a compositional approach is used, it is difficult to guarantee that any identified pattern, trend or anomaly is not an artefact of the constant-sum constraint. In summary, the authors recommend a chain of enquiry: search for the appropriate statistical method that can answer the geological or geochemical question at hand while maintaining the integrity of the compositional nature of the data, apply the required log-ratio transformations, and then apply the chosen statistical method. Interpreting the results may require a closer working relationship between statisticians, data analysts and geochemists.
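The sketch below illustrates both recommended alternatives on invented data: a knowledge-driven log-ratio that filters SiO2 dilution, and a clr transform followed by PCA, whose components are log-contrasts. Column names and values are assumptions for illustration only.

```python
# Sketch of the two alternatives named above (illustrative data/columns):
# (1) a knowledge-driven log-ratio to filter SiO2 dilution, and
# (2) clr-transformed data fed to PCA to extract log-contrasts.
import numpy as np

rng = np.random.default_rng(2)
cols = ["SiO2", "Al2O3", "Fe2O3", "Pb"]
X = np.abs(rng.normal([60, 15, 5, 0.01], [5, 2, 1, 0.005], size=(100, 4)))

# (1) Knowledge-driven log-ratio: Pb relative to SiO2 removes the
# quartz-dilution artefact from a single-component Pb map.
pb_vs_sio2 = np.log(X[:, 3] / X[:, 0])

# (2) clr + PCA: principal components of clr data are log-contrasts.
clr = np.log(X) - np.log(X).mean(axis=1, keepdims=True)
_, s, Vt = np.linalg.svd(clr - clr.mean(axis=0), full_matrices=False)
print("first log-contrast loadings:", dict(zip(cols, Vt[0].round(2))))
```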

Relevance: 60.00%

Abstract:

The full version of this thesis is available for individual consultation only at the Music Library of the Université de Montréal (www.bib.umontreal.ca/MU).

Relevance: 60.00%

Abstract:

The full version of this thesis is available for individual consultation only at the Music Library of the Université de Montréal (www.bib.umontreal.ca/MU).

Relevance: 60.00%

Abstract:

The full version of this thesis is available for individual consultation only at the Music Library of the Université de Montréal (www.bib.umontreal.ca/MU).

Relevance: 60.00%

Abstract:

Nowadays the importance of using software processes is well established and considered fundamental to the success of software development projects. Large and medium software projects demand the definition and continuous improvement of software processes in order to promote the productive development of high-quality software. Customising and evolving existing software processes to address the variety of scenarios, technologies, cultures and scales is a recurrent challenge in the software industry: it involves adapting software process models to the reality of individual projects, and it must also promote the reuse of past experience when defining and developing software processes for new projects. Adequate management and execution of software processes can bring better quality and productivity to the software systems produced. This work explored the use and adaptation of consolidated software product line techniques to manage the variabilities of software process families. To achieve this aim: (i) a systematic literature review was conducted to identify and characterise variability management approaches for software processes; (ii) an annotative approach for the variability management of software process lines was proposed and developed; and (iii) empirical studies and a controlled experiment assessed and compared the proposed annotative approach against a compositional one. One study, a comparative qualitative study, analysed the annotative and compositional approaches from different perspectives, such as modularity, traceability, error detection, granularity, uniformity, adoption and systematic variability management. Another study, a comparative quantitative study, considered internal attributes of software process line specifications, such as modularity, size and complexity. Finally, a controlled experiment evaluated the effort of use and the understandability of the investigated approaches when modelling and evolving software process line specifications. The studies provide evidence of several benefits of the annotative approach, and of its potential for integration with the compositional approach, in assisting the variability management of software process lines.
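As a toy illustration of the annotative idea (the process names and feature labels are invented, not taken from the study), a single annotated process model can be filtered per configuration, whereas a compositional approach would instead merge separately specified fragments:

```python
# Illustrative contrast: annotative variability keeps one annotated model
# and derives a concrete process by filtering activities on their
# presence conditions. Names and features below are invented.
base_process = [
    ("requirements", None),            # mandatory activity
    ("code_review",  "agile"),         # only present in agile projects
    ("formal_spec",  "safety"),        # only present in safety projects
    ("testing",      None),
]

def derive(annotated, features):
    """Annotative derivation: keep activities whose condition holds."""
    return [a for a, cond in annotated if cond is None or cond in features]

print(derive(base_process, {"agile"}))    # ['requirements', 'code_review', 'testing']
print(derive(base_process, {"safety"}))   # ['requirements', 'formal_spec', 'testing']
```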

Relevance: 60.00%

Abstract:

Testing is nowadays the most widely used technique to validate software and assess its quality. It is integrated into all practical software development methodologies and plays a crucial role in the success of any software project. From the smallest units of code to the most complex components, their integration into a software system and later deployment, all pieces of a software product must be tested thoroughly before the product can be released to a production environment. The main limitation of software testing is that it remains a mostly manual task, representing a large fraction of the total development cost. In this scenario, test automation is paramount to alleviate such high costs.

Test case generation (TCG) is the process of automatically generating test inputs that achieve high coverage of the system under test. Among a wide variety of approaches to TCG, this thesis focuses on structural (white-box) TCG, where one of the most successful enabling techniques is symbolic execution. In symbolic execution, the program under test is executed with its input arguments being symbolic expressions rather than concrete values. This thesis relies on a general framework for constraint-based TCG of imperative object-oriented programs (e.g., Java), in which the imperative program under test is first translated into an equivalent constraint logic program, and the translated program is then executed symbolically using the standard evaluation mechanisms of Constraint Logic Programming (CLP), extended with special operations for dynamically allocated data structures.

Improving the scalability and efficiency of symbolic execution constitutes a major challenge. It is well known that symbolic execution quickly becomes impractical due to the large number of execution paths that must be explored and the size of the constraints that must be handled. Moreover, symbolic-execution-based TCG tends to produce an unnecessarily large number of test cases when applied to medium or large programs.

The contributions of this thesis can be summarised as follows. (1) A compositional approach to CLP-based TCG is developed, which alleviates the inter-procedural path explosion problem by analysing each component (e.g., method) of the program under test separately, storing the results as method summaries and reusing them incrementally to obtain whole-program results; a similar compositional strategy based on program specialisation (partial evaluation) is also developed for the state-of-the-art symbolic execution tool Symbolic PathFinder (SPF). (2) Resource-driven TCG is proposed as a methodology that uses resource consumption information to drive symbolic execution towards those parts of the program under test that comply with a given resource policy, avoiding the exploration of parts of the program that violate it. (3) A generic methodology is proposed to guide symbolic execution towards the most interesting parts of a program, using abstractions as oracles to steer the execution according to structural selection criteria. (4) A new heap-constraint solver is proposed, which efficiently handles constraints on dynamically allocated memory (the heap) and aliasing of references during symbolic execution, and greatly outperforms the standard technique for this purpose, lazy initialization. (5) All of the techniques above have been implemented in the PET system (the compositional approach also in the SPF tool). Experimental evaluation has confirmed that they considerably improve the scalability and efficiency of symbolic execution and test case generation.
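A toy sketch of the compositional idea in contribution (1), under heavy simplification: each method is analysed once into path summaries, stored, and reused at call sites instead of re-exploring the callee's paths. Real engines track full path constraints and symbolic heaps; here a "summary" is just a (condition, result) pair over one symbolic variable.

```python
# Toy sketch of compositional TCG via method summaries: analyse each
# method once, store its path summaries, reuse them at call sites.
SUMMARIES = {}

def summarize_abs():
    """Symbolic 'execution' of abs(x): two paths over symbol 'x'."""
    return [("x >= 0", "x"), ("x < 0", "-x")]

def get_summary(method, analyzer):
    if method not in SUMMARIES:            # analyse once, reuse thereafter
        SUMMARIES[method] = analyzer()
    return SUMMARIES[method]

def execute_caller():
    """Paths of `y = abs(x); if y > 10 ...` built from the stored summary."""
    paths = []
    for cond, result in get_summary("abs", summarize_abs):
        paths.append((f"{cond} and {result} > 10", "big"))
        paths.append((f"{cond} and {result} <= 10", "small"))
    return paths

for path in execute_caller():
    print(path)   # whole-program paths without re-analysing abs()
```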

Relevance: 40.00%

Abstract:

We present a type inference algorithm, in the style of compositional analysis, for the language TRAFFIC, a specification language for flow composition applications proposed in [2], and prove that this algorithm is correct: the typings it infers are principal typings, and they agree with syntax-directed type checking on closed flow specifications. The algorithm is capable of verifying partial flow specifications, a significant improvement over the syntax-directed type-checking algorithm presented in [3]. We also show that the algorithm runs efficiently, i.e., in low-degree polynomial time.
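TRAFFIC's type system is not reproduced here; the fragment below only illustrates the kind of unification step a compositional inference algorithm performs when fragments typed in isolation are composed and their constraint sets merged (toy types, no recursion into type constructors):

```python
# Illustrative only: a tiny unification step of the sort used when two
# independently typed fragments are composed. Type variables are strings
# starting with '?'; constructed types are not handled in this toy.
def unify(t1, t2, subst):
    """Unify two types under an existing substitution."""
    t1, t2 = subst.get(t1, t1), subst.get(t2, t2)
    if t1 == t2:
        return subst
    if isinstance(t1, str) and t1.startswith("?"):
        return {**subst, t1: t2}
    if isinstance(t2, str) and t2.startswith("?"):
        return {**subst, t2: t1}
    raise TypeError(f"cannot unify {t1} with {t2}")

# Fragment A emits type '?a'; fragment B expects 'int'. Composing them
# merges the constraints and instantiates '?a' := 'int'.
print(unify("?a", "int", {}))   # {'?a': 'int'}
```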

Relevance: 40.00%

Abstract:

The Irish and UK governments, along with other countries, have made a commitment to limit the concentrations of greenhouse gases in the atmosphere by reducing emissions from the burning of fossil fuels. This can be achieved, in part, by increasing the sequestration of CO2 from the atmosphere, including monitoring the amount stored in vegetation and soils. A large proportion of soil carbon is held within peat, owing to the relatively high carbon density of peat and organic-rich soils. This is particularly important for a country such as Ireland, where some 16% of the land surface is covered by peat. For Northern Ireland, it has been estimated that the total amount of carbon stored in vegetation is 4.4 Mt, compared with 386 Mt stored within peat and soils. It has therefore become increasingly important to measure and monitor changes in soil carbon stores. The conservation and restoration of peat-covered areas, although ongoing for many years, has become increasingly important; this is reflected in current EU policy outlined by the European Commission (2012), which seeks to assess the relative contributions of the different inputs and outputs of organic carbon and organic matter to and from soil. Results are presented from the EU-funded Tellus Border Soil Carbon Project (2011-2013), which aimed to improve current estimates of carbon in soil and peat across Northern Ireland and the bordering counties of the Republic of Ireland.

Historical reports and previous surveys provide baseline data. To monitor change in peat depth and soil organic carbon, these historical data are integrated with more recently acquired airborne geophysical (radiometric) data and ground-based geochemical data generated by two surveys: the Tellus Project (2004-2007), covering Northern Ireland, and the EU-funded Tellus Border project (2011-2013), covering the six bordering counties of the Republic of Ireland (Donegal, Sligo, Leitrim, Cavan, Monaghan and Louth). The concept being applied is that saturated organic-rich soil and peat attenuate gamma radiation from the underlying soils and rocks. This research uses the degree of spatial correlation (coregionalization) between peat depth, soil organic carbon (SOC) and the attenuation of the radiometric signal to update a limited sampling regime of ground-based measurements with remotely acquired data. To comply with the compositional nature of the SOC data (perturbations of loss-on-ignition [LOI] data), a compositional data analysis approach is investigated. Contemporaneous ground-based measurements allow corroboration of the updated mapped outputs. This provides a methodology that can improve estimates of soil carbon with minimal impact on sensitive habitats (such as peat bogs) but with maximum output of data and knowledge.
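A minimal sketch of the stated attenuation premise, assuming an exponential model I = I0 * exp(-mu * d) with invented values for the unattenuated count rate I0 and the attenuation coefficient mu, inverting an observed count rate to a peat-depth estimate:

```python
# Minimal sketch under an assumed exponential attenuation model
# I = I0 * exp(-mu * d): invert an observed radiometric count rate to a
# peat-depth estimate. I0 and mu are illustrative values, not calibrated.
import math

I0 = 100.0     # unattenuated count rate over bare mineral soil (assumed)
mu = 0.8       # effective attenuation coefficient per metre (assumed)

def peat_depth(I_obs):
    """Invert the attenuation model: d = -ln(I_obs / I0) / mu."""
    return -math.log(I_obs / I0) / mu

print(f"{peat_depth(30.0):.2f} m")   # observed 30 cps -> ~1.50 m of peat
```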