985 results for Event-Driven Programming


Relevance:

30.00%

Publisher:

Abstract:

Epidemiological studies have led to the hypothesis that major risk factors for developing diseases such as hypertension, cardiovascular disease and adult-onset diabetes are established during development. This developmental programming hypothesis proposes that exposure to an adverse stimulus or insult at critical, sensitive periods of development can induce permanent alterations in normal physiological processes that lead to increased disease risk later in life. For cancer, inheritance of a tumor suppressor gene defect confers a high relative risk for disease development. However, these defects are rarely 100% penetrant. Traditionally, gene-environment interactions are thought to contribute to the penetrance of tumor suppressor gene defects by facilitating or inhibiting the acquisition of additional somatic mutations required for tumorigenesis. The studies presented herein identify developmental programming as a distinctive type of gene-environment interaction that can enhance the penetrance of a tumor suppressor gene defect in adult life. Using rats predisposed to uterine leiomyoma due to a germ-line defect in one allele of the tuberous sclerosis complex 2 (Tsc-2) tumor suppressor gene, these studies show that early-life exposure to the xenoestrogen diethylstilbestrol (DES) during development of the uterus increased tumor incidence, multiplicity and size in genetically predisposed animals, but failed to induce tumors in wild-type rats. Uterine leiomyomas are ovarian-hormone-dependent tumors that develop from the uterine myometrium. DES exposure was shown to developmentally program the myometrium, causing increased expression of estrogen-responsive genes prior to the onset of tumors. Loss of function of the normal Tsc-2 allele remained the rate-limiting event for tumorigenesis; however, tumors that developed in exposed animals displayed an enhanced proliferative response to ovarian steroid hormones relative to tumors that developed in unexposed animals. Furthermore, the studies presented herein identify developmental periods during which target tissues are maximally susceptible to developmental programming. These data suggest that exposure to environmental factors during critical periods of development can permanently alter normal physiological tissue responses and thus lead to increased disease risk in genetically susceptible individuals.

Relevance:

30.00%

Publisher:

Abstract:

The Late Permian mass extinction event about 252 million years ago was the most severe biotic crisis of the past 500 million years and occurred during an episode of global warming. The loss of around two-thirds of marine genera is thought to have had substantial ecological effects, but the overall impacts on the functioning of marine ecosystems and the pattern of marine recovery are uncertain. Here we analyse the fossil occurrences of all known benthic marine invertebrate genera from the Permian and Triassic periods, and assign each to a functional group based on their inferred lifestyle. We show that despite the selective extinction of 62-74% of these genera, all but one functional group persisted through the crisis, indicating that there was no significant loss of functional diversity at the global scale. In addition, only one new mode of life originated in the extinction aftermath. We suggest that Early Triassic marine ecosystems were not as ecologically depauperate as widely assumed. Functional diversity was, however, reduced in particular regions and habitats, such as tropical reefs; at these smaller scales, recovery varied spatially and temporally, probably driven by migration of surviving groups. We find that marine ecosystems did not return to their pre-extinction state, and by the Middle Triassic greater functional evenness is recorded, resulting from the radiation of previously subordinate groups such as motile, epifaunal grazers.

Relevance:

30.00%

Publisher:

Abstract:

The mid-Cretaceous is thought to have been a greenhouse world with significantly higher atmospheric pCO2 and sea-surface temperatures, as well as a much flatter latitudinal thermal gradient, compared to the present. This time interval was punctuated by the Cenomanian/Turonian Oceanic Anoxic Event (OAE-2, ~93.5 Myr ago), an episode of global, massive organic carbon burial that likely resulted in a large and abrupt pCO2 decline. However, the climatic consequences of this pCO2 drop remain poorly constrained. We determined the first high-resolution sea-surface temperature (SST) record across OAE-2 from a deep-marine sedimentary sequence at Ocean Drilling Program (ODP) Site 1276 in the mid-latitude Newfoundland Basin, NW Atlantic. By employing the organic palaeothermometer TEX86, we found that SSTs across the OAE-2 interval were extremely high but were punctuated by a remarkably large cooling (5-11 °C), which is synchronous with the 2.5-5.5 °C cooling in SST records from equatorial Atlantic sites and with the "Plenus Cold Event". Because this global cooling event is concurrent with increased organic carbon burial, it most likely occurred in response to the associated pCO2 drop. Our findings imply a substantial increase in the latitudinal SST gradient in the proto-North Atlantic during this period of global cooling and reduced atmospheric pCO2, suggesting a strong coupling between pCO2 and latitudinal thermal gradients under greenhouse climate conditions.
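
For reference, the TEX86 index mentioned above is computed from the relative abundances of archaeal membrane lipids (GDGTs) preserved in marine sediments; it is commonly defined as below and is converted to SST through empirical calibrations (the specific calibration used at Site 1276 is not stated in this abstract).

```latex
\mathrm{TEX}_{86} \;=\; \frac{[\mathrm{GDGT\text{-}2}] + [\mathrm{GDGT\text{-}3}] + [\mathrm{Cren'}]}{[\mathrm{GDGT\text{-}1}] + [\mathrm{GDGT\text{-}2}] + [\mathrm{GDGT\text{-}3}] + [\mathrm{Cren'}]}
```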

Relevance:

30.00%

Publisher:

Abstract:

Oxygen minimum zones are expanding globally and at present account for around 20-40% of oceanic nitrogen loss. Heterotrophic denitrification and anammox (anaerobic ammonium oxidation with nitrite) are responsible for most nitrogen loss in these low-oxygen waters. Anammox is particularly significant in the eastern tropical South Pacific, one of the largest oxygen minimum zones globally. However, the factors that regulate anammox-driven nitrogen loss have remained unclear. Here we present a comprehensive nitrogen budget for the eastern tropical South Pacific oxygen minimum zone, using measurements of nutrient concentrations, experimentally determined rates of nitrogen transformation and a numerical model of export production. Anammox was the dominant mode of nitrogen loss at the time of sampling. Rates of anammox, and of related nitrogen transformations, were greatest in the productive shelf waters and tailed off with distance from the coast. Within the shelf region, anammox activity peaked in both upper and bottom waters. Overall, rates of nitrogen transformation, including anammox, were strongly correlated with the export of organic matter. We suggest that the sinking of organic matter, and thus the release of ammonium into the water column, together with benthic ammonium release, fuels nitrogen loss from oxygen minimum zones.
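
For reference, the anammox reaction referred to above combines ammonium and nitrite into dinitrogen gas; the commonly cited net stoichiometry (ignoring minor side products) is:

```latex
\mathrm{NH_4^+} + \mathrm{NO_2^-} \;\longrightarrow\; \mathrm{N_2} + 2\,\mathrm{H_2O}
```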

Relevance:

30.00%

Publisher:

Abstract:

There have been several previous proposals for the integration of Object-Oriented Programming features into Logic Programming, resulting in a substantial body of supporting theory and several language proposals. However, none of these proposals seems to have made it into the mainstream. Perhaps one of the reasons for this is that the resulting languages depart too much from standard logic programming languages to entice the average Prolog programmer. Another reason may be that most of what can be done with object-oriented programming can already be done in Prolog through the meta- and higher-order programming facilities that the language includes, albeit sometimes in a more cumbersome way. In light of this, in this paper we propose an alternative solution driven by two main objectives. The first is to include only those characteristics of object-oriented programming which are cumbersome to implement in standard Prolog systems. The second is to do this in such a way that there is minimal impact on the syntax and complexity of the language, i.e., to introduce the minimum number of new constructs, declarations, and concepts to be learned. Finally, we would like the implementation to be as straightforward as possible, ideally based on simple source-to-source expansions.

Relevance:

30.00%

Publisher:

Abstract:

Testing is nowadays the most widely used technique to validate software and assess its quality. It is integrated into all practical software development methodologies and plays a crucial role in the success of any software project. From the smallest units of code to the most complex components, their integration into a software system and later deployment, all pieces of a software product must be tested thoroughly before the product can be released to a production environment. The main limitation of software testing is that it remains a mostly manual task, representing a large fraction of the total development cost. In this scenario, test automation is paramount to alleviate such high costs.

Test case generation (TCG) is the process of automatically generating test inputs that achieve high coverage of the program under test. Among the wide variety of approaches to TCG, this thesis focuses on structural (white-box) TCG, where one of the most successful enabling techniques is symbolic execution. In symbolic execution, the program under test is executed with symbolic expressions as input arguments rather than concrete values. This thesis relies on a constraint-based TCG framework for imperative object-oriented programs (e.g., Java), in which the imperative program under test is first translated into an equivalent constraint logic program, and the translated program is then symbolically executed using the standard evaluation mechanisms of Constraint Logic Programming (CLP), extended with special operations for dynamically allocated data structures.

Improving the scalability and efficiency of symbolic execution constitutes a major challenge. It is well known that symbolic execution quickly becomes impractical due to the large number of execution paths that must be explored and the size of the constraints that must be handled. Moreover, symbolic-execution-based TCG tends to produce an unnecessarily large number of test cases when applied to medium-sized or large programs.

The contributions of this thesis can be summarized as follows. (1) A compositional approach to CLP-based TCG is developed which alleviates the inter-procedural path explosion problem by analysing each component (e.g., method) of the program under test separately, storing the results as method summaries and reusing them incrementally to obtain whole-program results. A similar compositional strategy based on program specialization (partial evaluation) is also developed for the state-of-the-art symbolic execution tool Symbolic PathFinder (SPF). (2) Resource-driven TCG is proposed as a methodology that uses information on the resource consumption of the program under test to drive symbolic execution towards those parts of the program that comply with a user-provided resource policy, avoiding the exploration of parts that violate it. (3) A generic methodology is proposed to guide symbolic execution towards the most interesting parts of the program, using abstractions as oracles to steer the exploration according to structural selection criteria. (4) A new constraint solver is proposed that efficiently handles constraints on the global dynamic memory (heap), including aliasing of references, during symbolic execution, and that considerably outperforms the standard technique used for this purpose, lazy initialization. (5) All the proposed techniques have been implemented in the PET system (the compositional approach has also been implemented in the SPF tool). Experimental evaluation has confirmed that they considerably improve the scalability and efficiency of symbolic execution and test case generation.
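
As a minimal illustration of the symbolic-execution idea described above (not code from the thesis or the PET system; the method and the chosen inputs are hypothetical), consider a small Java method with two branches: symbolic execution enumerates its feasible paths, collects a path condition for each, and a constraint solver produces one concrete input per satisfiable condition.

```java
// Minimal illustration of what symbolic execution derives for a small Java method.
// The method and the "generated" inputs are hypothetical examples; real tools emit
// such tests automatically from the path conditions.
public class SymbolicExecutionSketch {

    // Program under test: two branches, hence three feasible paths.
    static int classify(int x, int y) {
        if (x > 0) {
            if (y > x) {
                return 2;      // path condition: x > 0  &&  y > x
            }
            return 1;          // path condition: x > 0  &&  y <= x
        }
        return 0;              // path condition: x <= 0
    }

    public static void main(String[] args) {
        // One concrete input per path condition, as a constraint solver might produce.
        int[][] inputs = { {1, 2}, {1, 1}, {0, 0} };
        int[] expected = { 2, 1, 0 };
        for (int i = 0; i < inputs.length; i++) {
            int got = classify(inputs[i][0], inputs[i][1]);
            System.out.printf("classify(%d, %d) = %d (expected %d)%n",
                    inputs[i][0], inputs[i][1], got, expected[i]);
        }
    }
}
```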

Relevance:

30.00%

Publisher:

Abstract:

We present an undergraduate course on concurrent programming where formal models are used at different stages of the learning process. The main practical difference with respect to other approaches lies in the fact that the ability to develop correct concurrent software relies on a systematic transformation of formal models of inter-process interaction (so-called shared resources), rather than on the specific constructs of some programming language. Using a resource-centric rather than a language-centric approach has benefits for both teachers and students. Besides the obvious advantage of being independent of the programming language, the models help in the early validation of concurrent software designs, provide students and teachers with a lingua franca that greatly simplifies communication in the classroom and during supervision, and help in the automatic generation of tests for the practical assignments. This method has been in use, with slight variations, for some 15 years, surviving changes in the programming language and course length. In this article, we describe the components and structure of the current incarnation of the course, which uses Java as the target language, and some tools used to support our method. We provide a detailed description of the different outcomes that the model-driven approach delivers (validation of the initial design, automatic generation of tests, and mechanical generation of code) from a teaching perspective. A critical discussion of the perceived advantages and risks of our approach follows, including some proposals on how these risks can be minimized. We include a statistical analysis showing that our method has a positive impact on students' ability to understand concurrency and to generate correct code.
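
As an illustration of what a shared resource can look like once realised in Java (a generic sketch under assumed requirements, not material from the course described above), the monitor below encapsulates both the data and the condition synchronisation that a formal model of a bounded buffer would specify:

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Sketch of a "shared resource" realised as a Java monitor: a bounded buffer whose
// invariant (0 <= size <= capacity) and condition synchronisation live inside the
// class, independently of the threads that use it. Illustration only.
public class BoundedBuffer<T> {
    private final Deque<T> items = new ArrayDeque<>();
    private final int capacity;

    public BoundedBuffer(int capacity) { this.capacity = capacity; }

    public synchronized void put(T item) throws InterruptedException {
        while (items.size() == capacity) {
            wait();                 // block until a consumer frees a slot
        }
        items.addLast(item);
        notifyAll();                // wake threads blocked in get()
    }

    public synchronized T get() throws InterruptedException {
        while (items.isEmpty()) {
            wait();                 // block until a producer adds an item
        }
        T item = items.removeFirst();
        notifyAll();                // wake threads blocked in put()
        return item;
    }

    public static void main(String[] args) throws InterruptedException {
        BoundedBuffer<Integer> buffer = new BoundedBuffer<>(2);
        Thread producer = new Thread(() -> {
            try {
                for (int i = 0; i < 5; i++) buffer.put(i);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        producer.start();
        for (int i = 0; i < 5; i++) System.out.println("consumed " + buffer.get());
        producer.join();
    }
}
```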

Relevance:

30.00%

Publisher:

Abstract:

Online services are no longer isolated. The release of public APIs and technologies such as web hooks is allowing users and developers to access their information easily. Intelligent agents could use this information to provide a better user experience across services, connecting services with smart automatic behaviours or actions. However, agent platforms are not prepared to easily add external sources such as web services, which hinders the usage of agents in the so-called Evented or Live Web. As a solution, this paper introduces an event-based architecture for agent systems, in accordance with the new tendencies in web programming. In particular, it is focused on personal agents that interact with several web services. With this architecture, called MAIA, connecting to new web services does not involve any modification to the platform.
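
The following sketch illustrates the general event-based style described above with a toy in-process event bus; the class names and the "mail.received" event are hypothetical and are not MAIA's actual API:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Consumer;

// Toy event bus: web-service notifications (e.g., delivered via web hooks) become
// events that are dispatched to subscribed agent behaviours. Illustrative names only.
public class EventBusSketch {

    // Simple event: a name plus an arbitrary payload (e.g., a parsed webhook body).
    record Event(String name, Map<String, String> payload) {}

    private final Map<String, List<Consumer<Event>>> handlers = new HashMap<>();

    // An agent behaviour subscribes to the events it cares about.
    public void subscribe(String eventName, Consumer<Event> handler) {
        handlers.computeIfAbsent(eventName, k -> new ArrayList<>()).add(handler);
    }

    // A web-service adapter publishes events; supporting a new service only means
    // publishing new event names, not modifying the platform.
    public void publish(Event event) {
        for (Consumer<Event> h : handlers.getOrDefault(event.name(), List.of())) {
            h.accept(event);
        }
    }

    public static void main(String[] args) {
        EventBusSketch bus = new EventBusSketch();
        bus.subscribe("mail.received", e ->
                System.out.println("Agent: new mail from " + e.payload().get("from")));
        // Simulate a webhook notification from an (assumed) mail service.
        bus.publish(new Event("mail.received", Map.of("from", "alice@example.org")));
    }
}
```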

Relevance:

30.00%

Publisher:

Abstract:

Context. This thesis is framed in experimental software engineering. More concretely, it addresses the problems that arise when assessing process conformance in test-driven development experiments conducted by UPM's Experimental Software Engineering group. Process conformance was studied using Besouro, an Eclipse plug-in tool. It has been observed that Besouro does not work correctly in some circumstances, which casts doubt on the correctness of the existing experimental data and renders it unusable. Aim. The main objective of this work is the identification and correction of Besouro's faults. A secondary goal is fixing the datasets already obtained in past experiments to the maximum possible extent, so that existing experimental results can be used with confidence. Method. (1) Test Besouro using different sequences of events (method creation, assertions, etc.) to identify the underlying faults. (2) Fix the code, and (3) fix the datasets using code specially created for this purpose. Results. (1) We confirmed the existence of several faults in Besouro's code that affected Test-First and Test-Last episode identification. These faults caused the incorrect identification of 20% of episodes. (2) We were able to fix Besouro's code. (3) The correction of the existing datasets was possible, subject to some restrictions (such as the impossibility of tracing code-size increases to programming time). Conclusion. The results of past experiments that depend on Besouro's data may not be trustworthy. We suspect that more faults remain in Besouro's code, whose identification requires further analysis.
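
To make the notion of Test-First versus Test-Last episode identification concrete, here is a deliberately simplified sketch; the event types and the heuristic are illustrative assumptions, not Besouro's actual classification rules:

```java
import java.util.List;

// Simplified sketch of episode classification in the style of tools such as Besouro:
// given an ordered log of IDE events, decide whether production code was written
// before or after its test. Illustrative heuristic only.
public class EpisodeClassifierSketch {

    enum EventType { TEST_EDIT, PRODUCTION_EDIT, TEST_RUN_FAIL, TEST_RUN_PASS }

    enum Episode { TEST_FIRST, TEST_LAST, UNKNOWN }

    static Episode classify(List<EventType> events) {
        int firstTestEdit = events.indexOf(EventType.TEST_EDIT);
        int firstProdEdit = events.indexOf(EventType.PRODUCTION_EDIT);
        boolean endsGreen = !events.isEmpty()
                && events.get(events.size() - 1) == EventType.TEST_RUN_PASS;
        if (firstTestEdit < 0 || firstProdEdit < 0 || !endsGreen) {
            return Episode.UNKNOWN;
        }
        return firstTestEdit < firstProdEdit ? Episode.TEST_FIRST : Episode.TEST_LAST;
    }

    public static void main(String[] args) {
        // Test written first, seen failing, then production code makes it pass.
        System.out.println(classify(List.of(
                EventType.TEST_EDIT, EventType.TEST_RUN_FAIL,
                EventType.PRODUCTION_EDIT, EventType.TEST_RUN_PASS)));  // TEST_FIRST
        // Production code written first, test added afterwards.
        System.out.println(classify(List.of(
                EventType.PRODUCTION_EDIT, EventType.TEST_EDIT,
                EventType.TEST_RUN_PASS)));                             // TEST_LAST
    }
}
```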

Relevance:

30.00%

Publisher:

Abstract:

Nowadays, data mining is based on low-level specifications of the employed techniques, typically bound to a specific analysis platform. Therefore, data mining lacks a modelling architecture that allows analysts to consider it as a truly software-engineering process. Here, we propose a model-driven approach based on (i) a conceptual modelling framework for data mining, and (ii) a set of model transformations to automatically generate both the data under analysis (via data-warehousing technology) and the analysis models for data mining (tailored to a specific platform). Thus, analysts can concentrate on the analysis problem via conceptual data-mining models instead of low-level programming tasks related to the underlying platform's technical details. These tasks are now entrusted to the model-transformation scaffolding.

Relevance:

30.00%

Publisher:

Abstract:

Data mining is one of the most important analysis techniques for automatically extracting knowledge from large amounts of data. Nowadays, data mining is based on low-level specifications of the employed techniques, typically bound to a specific analysis platform. Therefore, data mining lacks a modelling architecture that allows analysts to consider it as a truly software-engineering process. Bearing in mind this situation, we propose a model-driven approach based on (i) a conceptual modelling framework for data mining, and (ii) a set of model transformations to automatically generate both the data under analysis (deployed via data-warehousing technology) and the analysis models for data mining (tailored to a specific platform). Thus, analysts can concentrate on understanding the analysis problem via conceptual data-mining models instead of wasting effort on low-level programming tasks related to the underlying platform's technical details. These time-consuming tasks are now entrusted to the model-transformation scaffolding. The feasibility of our approach is shown by means of a hypothetical data-mining scenario where a time-series analysis is required.
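
As a toy illustration of the model-driven idea (the conceptual-model fields and the generated syntax are assumptions, not the paper's actual metamodel or transformations), the sketch below turns a platform-independent mining model into two platform-specific artefacts:

```java
// Toy illustration: a small "conceptual data-mining model" is transformed into
// platform-specific artefacts (here, a SQL view for the data under analysis and a
// pseudo-script for the mining step). Illustrative names and syntax only.
public class MiningModelTransformSketch {

    // Conceptual model: what to analyse, independent of any platform.
    record TimeSeriesMiningModel(String sourceTable, String timeColumn,
                                 String valueColumn, int windowSize) {}

    // Transformation 1: derive the data under analysis (data-warehouse side).
    static String toDataView(TimeSeriesMiningModel m) {
        return "CREATE VIEW analysis_input AS SELECT " + m.timeColumn() + ", "
                + m.valueColumn() + " FROM " + m.sourceTable()
                + " ORDER BY " + m.timeColumn() + ";";
    }

    // Transformation 2: derive a platform-specific analysis script (pseudo-syntax).
    static String toAnalysisScript(TimeSeriesMiningModel m) {
        return "timeseries.forecast(input=analysis_input, value=" + m.valueColumn()
                + ", window=" + m.windowSize() + ")";
    }

    public static void main(String[] args) {
        TimeSeriesMiningModel model =
                new TimeSeriesMiningModel("sales_fact", "sale_date", "amount", 12);
        System.out.println(toDataView(model));
        System.out.println(toAnalysisScript(model));
    }
}
```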

Relevance:

30.00%

Publisher:

Abstract:

Sensing techniques are important for solving problems of uncertainty inherent in intelligent grasping tasks. The main goal here is to present a visual sensing system based on range-imaging technology for robot manipulation of non-rigid objects. Our proposal provides a suitable visual perception system for complex grasping tasks, supporting the robot controller when other sensor systems, such as tactile and force sensors, are not able to obtain useful data relevant to the grasping manipulation task. In particular, a new visual approach based on RGBD data was implemented to help a robot controller carry out intelligent manipulation tasks with flexible objects. The proposed method supervises the interaction between the grasped object and the robot hand in order to avoid poor contact between the fingertips and the object when neither force nor pressure data are available. This new approach is also used to measure changes in the shape of an object's surfaces, allowing us to find deformations caused by inappropriate pressure applied by the hand's fingers. Tests were carried out on grasping tasks involving several flexible household objects with a multi-fingered robot hand working in real time. Our approach generates pulses from the deformation detection method and sends an event message to the robot controller when surface deformation is detected. In comparison with other methods, the obtained results reveal that our visual pipeline does not require deformation models of objects and materials, and that the approach works well with both planar and 3D household objects in real time. In addition, our method does not depend on the pose of the robot hand, because the location of the reference system is computed from the recognition of a pattern located on the robot forearm. The presented experiments demonstrate that the proposed method provides good monitoring of grasping tasks with several objects and different grasping configurations in indoor environments.
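
A much-simplified sketch of the deformation-monitoring idea follows; the frame format, threshold and callback are illustrative assumptions, and a real system would operate on segmented RGBD point clouds rather than raw depth arrays:

```java
// Simplified sketch: compare the current depth image of the grasped object's surface
// against a reference captured at first contact, and raise an event when the surface
// has changed too much. Illustrative thresholds and interfaces only.
public class DeformationMonitorSketch {

    interface DeformationListener { void onDeformation(double meanChangeMetres); }

    private final float[] referenceDepth;   // depth frame at first stable contact
    private final double threshold;         // mean change (m) that counts as deformation
    private final DeformationListener listener;

    public DeformationMonitorSketch(float[] referenceDepth, double threshold,
                                    DeformationListener listener) {
        this.referenceDepth = referenceDepth.clone();
        this.threshold = threshold;
        this.listener = listener;
    }

    // Called for every new depth frame (same resolution as the reference).
    public void onFrame(float[] currentDepth) {
        double sum = 0;
        int valid = 0;
        for (int i = 0; i < referenceDepth.length; i++) {
            if (referenceDepth[i] > 0 && currentDepth[i] > 0) {   // ignore invalid pixels
                sum += Math.abs(currentDepth[i] - referenceDepth[i]);
                valid++;
            }
        }
        double meanChange = valid > 0 ? sum / valid : 0;
        if (meanChange > threshold) {
            listener.onDeformation(meanChange);   // e.g., tell the controller to ease the grip
        }
    }

    public static void main(String[] args) {
        float[] reference = {0.50f, 0.50f, 0.51f, 0.50f};
        DeformationMonitorSketch monitor = new DeformationMonitorSketch(
                reference, 0.005, d -> System.out.printf("deformation event: %.4f m%n", d));
        monitor.onFrame(new float[] {0.50f, 0.49f, 0.48f, 0.49f});   // triggers the event
    }
}
```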

Relevance:

30.00%

Publisher:

Abstract:

Integrity assurance of configuration data has a significant impact on the reliability of microcontroller-based systems. This is especially true for applications driven by events whose behaviour is tightly coupled to this kind of data. This work proposes a new hybrid technique that combines hardware and software resources for detecting and recovering from soft errors in system configuration data. Our approach is based on the use of a common built-in microcontroller resource (a timer) that works jointly with a software-based technique responsible for periodically refreshing the configuration data. The experiments demonstrate that non-destructive single-event effects can be effectively mitigated with reduced overheads. Results show an important increase in fault coverage for SEUs and SETs, of about one order of magnitude.
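
To illustrate the periodic-refresh idea in code (written in Java purely for readability; a real implementation would be a hardware timer interrupt on the microcontroller rewriting peripheral registers, and all names here are assumptions):

```java
import java.util.Arrays;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

// Illustration of periodic configuration refresh (scrubbing): a timer regularly
// rewrites the live configuration from a protected golden copy, so a soft error in
// the configuration is corrected at the next refresh. Synchronisation is omitted
// for brevity; a timer ISR on a microcontroller would not need it.
public class ConfigScrubberSketch {

    private final int[] goldenConfig;       // reference copy (assumed protected)
    private final int[] liveConfig;         // registers that an SEU may corrupt

    public ConfigScrubberSketch(int[] goldenConfig) {
        this.goldenConfig = goldenConfig.clone();
        this.liveConfig = goldenConfig.clone();
    }

    // Periodic refresh: detect and repair any divergence from the golden copy.
    void refresh() {
        for (int i = 0; i < liveConfig.length; i++) {
            if (liveConfig[i] != goldenConfig[i]) {
                System.out.printf("soft error detected in word %d, restoring%n", i);
                liveConfig[i] = goldenConfig[i];
            }
        }
    }

    public static void main(String[] args) throws InterruptedException {
        ConfigScrubberSketch scrubber = new ConfigScrubberSketch(new int[] {0xA5, 0x3C, 0x0F});
        ScheduledExecutorService timer = Executors.newSingleThreadScheduledExecutor();
        timer.scheduleAtFixedRate(scrubber::refresh, 100, 100, TimeUnit.MILLISECONDS);

        scrubber.liveConfig[1] ^= 0x04;      // inject a single-event upset (bit flip)
        Thread.sleep(300);                   // the next refresh repairs it
        System.out.println("config after scrubbing: " + Arrays.toString(scrubber.liveConfig));
        timer.shutdown();
    }
}
```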

Relevance:

30.00%

Publisher:

Abstract:

Thesis (M.S.)--University of Illinois at Urbana-Champaign.