778 results for Maths lessons execution
Abstract:
Provenance plays a major role in understanding and reusing the methods applied in a scientific experiment, as it provides a record of inputs, the processes carried out, and the use and generation of intermediate and final results. In the specific case of in-silico scientific experiments, a large variety of scientific workflow systems (e.g., Wings, Taverna, Galaxy, Vistrails) have been created to support scientists. All of these systems produce some sort of provenance about the executions of the workflows that encode scientific experiments. However, provenance is normally recorded at a very low level of detail, which complicates the understanding of what happened during execution. In this paper we propose an approach to automatically obtain abstractions from low-level provenance data by finding common workflow fragments in workflow execution provenance and relating them to templates. We have tested our approach with a dataset of workflows published by the Wings workflow system. Our results show that by using these kinds of abstractions we can highlight the most common abstract methods used in the executions of a repository, relating different runs and workflow templates with each other.
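To make the fragment-finding idea concrete, here is a minimal sketch in Python of counting workflow fragments shared across runs. It assumes each run's provenance has already been reduced to an ordered list of step names; the function and data layout are invented for illustration and are not the paper's actual algorithm or provenance model.

from collections import Counter

def common_fragments(runs, min_support=2, max_len=3):
    """Count contiguous step sequences (fragments) shared across runs.
    `runs` is a list of executions, each an ordered list of step names;
    fragments appearing in at least `min_support` runs are returned."""
    support = Counter()
    for steps in runs:
        seen = set()
        for size in range(2, max_len + 1):
            for i in range(len(steps) - size + 1):
                seen.add(tuple(steps[i:i + size]))
        support.update(seen)  # count each fragment once per run
    return {frag: n for frag, n in support.items() if n >= min_support}

# Two runs share the (normalize, cluster) fragment, an abstraction
# candidate that relates both executions to a common template.
runs = [["fetch", "normalize", "cluster", "plot"],
        ["load", "normalize", "cluster", "report"]]
print(common_fragments(runs))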
Abstract:
This article presents, in an informal way, some early results on the design of a series of paradigms for the visualization of the parallel execution of logic programs. The results presented here refer to the visualization of or-parallelism, as in MUSE and Aurora; deterministic dependent and-parallelism, as in Andorra-I; and independent and-parallelism, as in &-Prolog. A tool has been implemented for this purpose and has been interfaced with these systems. Results are presented showing the visualization of executions from these systems, and the usefulness of the resulting tool is briefly discussed.
Abstract:
In this paper, we examine the issue of memory management in the parallel execution of logic programs. We concentrate on non-deterministic and-parallel schemes, which we believe present a relatively general set of problems to be solved, including most of those encountered in the memory management of or-parallel systems. We present a distributed stack memory management model which allows flexible scheduling of goals. Previously proposed models (based on the "Marker model") are lacking in that they impose restrictions on the selection of goals to be executed or may consume a large amount of virtual memory. This paper first presents results which imply that the above-mentioned shortcomings can have significant performance impacts. An extension of the Marker model is then proposed which allows flexible scheduling of goals while keeping (virtual) memory consumption down. Measurements are presented which show the advantage of this solution. Methods for handling forward and backward execution, cut, and roll back are discussed in the context of the proposed scheme. In addition, the paper shows how the same mechanism for flexible scheduling can be applied to allow the efficient handling of the very general form of suspension that can occur in systems which combine several types of and-parallelism and more sophisticated methods of executing logic programs. We believe that the results are applicable to many and- and or-parallel systems.
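The marker idea can be pictured with a toy sketch: begin/end markers delimit each goal's section of a shared stack so sections can be located and reclaimed, but a section buried under later ones cannot be freed immediately, which is the kind of scheduling restriction the proposed extension relaxes. The class and method names below are invented for illustration and do not reproduce the paper's actual scheme.

class MarkerStack:
    # Toy model: each parallel goal's frames live between marker
    # positions recorded in `sections`, so a completed goal's section
    # can be found even when goals finish out of order.
    def __init__(self):
        self.cells = []     # the shared stack of frames
        self.sections = {}  # goal id -> (begin, end) marker positions

    def push_goal(self, goal_id, frames):
        begin = len(self.cells)
        self.cells.extend(frames)
        self.sections[goal_id] = (begin, len(self.cells))

    def pop_goal(self, goal_id):
        # Reclaim the section only if it is topmost; otherwise it
        # lingers until everything above it has been popped.
        begin, end = self.sections[goal_id]
        if end == len(self.cells):
            del self.cells[begin:]
            del self.sections[goal_id]
            return True
        return False

stack = MarkerStack()
stack.push_goal("g1", ["frame_a", "frame_b"])
stack.push_goal("g2", ["frame_c"])
print(stack.pop_goal("g1"))  # False: g2 still sits above g1
print(stack.pop_goal("g2"))  # True
print(stack.pop_goal("g1"))  # True: now topmost and reclaimable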
Abstract:
Spain has a long tradition of encouraging toll highways by granting concessions to private companies. Concessions in Spain have been characterized by a willingness to transfer considerable risk to the private sector. Traffic demand, acquisition of the right-of-way, and financial risk have often been allocated to the private sector. From 1996 to 2011, 16 toll highway concessions, covering a total distance of 835 km, were awarded by the central government of Spain with this approach. Some of those highways started their operations just before the economic recession began. The recession had negative consequences for Spain's economy. The gross domestic product per capita plummeted, and the unemployment rate increased from 9% to 20% of the working population in just 2 years. The recession also had severe consequences for the economic performance of toll highway concessions. Traffic levels declined at a much greater rate than did the gross domestic product. In addition, the conditions imposed by the financial markets on borrowers became much stricter because of the liquidity crisis. This study analyzes the impact that the economic recession ultimately had on the performance of toll highway concessions in Spain and the actions that the government adopted to avoid the bankruptcy of the concessionaires. It was found that the economic recession helped identify some deficiencies in how risk had been allocated in Spain. The measures that both Spain and the European Union are adopting so as to improve risk allocation are discussed.
Abstract:
Expert systems are built from knowledge traditionally elicited from the human expert. It is precisely knowledge elicitation from the expert that is the bottleneck in expert system construction. On the other hand, a data mining system, which automatically extracts knowledge, needs expert guidance on the successive decisions to be made in each of the system phases. In this context, expert knowledge and data mining discovered knowledge can cooperate, maximizing their individual capabilities: data mining discovered knowledge can be used as a complementary source of knowledge for the expert system, whereas expert knowledge can be used to guide the data mining process. This article summarizes different examples of systems where there is cooperation between expert knowledge and data mining discovered knowledge and reports our experience of such cooperation gathered from a medical diagnosis project called Intelligent Interpretation of Isokinetics Data, which we developed. From that experience, a series of lessons were learned throughout project development. Some of these lessons are generally applicable and others pertain exclusively to certain project types.
Abstract:
This paper analyzes the singularities inherent to the financial industry in relation to other businesses, and their implications for financial crises throughout history. The efficient markets hypothesis is questioned, and its impact on the deregulation of the financial system is analyzed. Finally, the causes of the current crisis are investigated, and the general lines to be addressed in redesigning a financial system that achieves an efficient and equitable capitalism are suggested.
Abstract:
The Smartcity Málaga project is one of Europe's largest eco-efficient city initiatives. The project has implemented a field trial in 50 households to study the effects of energy monitoring and management technologies on residential electricity consumption. This poster presents some lessons learned on energy consumption trends, smart clamp reliability, and the suitability of the power contracted by users, obtained after six months of data analysis.
Abstract:
The effects of nature on people's minds have been an active research theme for decades. However, the impact of people's minds on landscape ecological health has received less attention. How and why do perception, meanings, and mental constructs determine the way nature is valued and, consequently, managed? What should this interplay look like? These are in some cases more relevant questions than knowing which particular landscapes are preferred (Carlson 1993). This was the underlying inquiry in the focus group experience held in a natural protected area in La Rioja (Spain). Participants were asked to locate on a map areas representing low/high quality in terms of ecology and aesthetics. Some relevant conclusions for landscape management were derived from the analysis of the participants' discourse in terms of ecological and aesthetic appreciation and their views on how humans take their place in nature.
Abstract:
The Privacy by Design approach to systems engineering introduces privacy requirements in the early stages of development, instead of patching up a built system afterwards. However, 'vague', 'disconnected from technology', and 'aspirational' are some of the terms employed nowadays to refer to the privacy principles which must lead the development process. Although privacy has become a first-class citizen in the realm of non-functional requirements and some methodological frameworks help developers by providing design guidance, software engineers often miss a solid reference detailing which specific, technical requirements they must abide by, and a systematic methodology to follow. In this position paper, we look into a domain that has already successfully tackled these problems (web accessibility) and propose translating its findings into the realm of privacy requirements engineering, analyzing as well the gaps not yet covered by current privacy initiatives.
Abstract:
Logic programming (LP) is a family of high-level programming languages which provides high expressive power. With LP, the programmer writes the properties of the result and/or executable specifications instead of detailed computation steps. Logic programming systems which feature tabled execution and constraint logic programming (CLP) have been shown to increase the declarativeness and efficiency of Prolog, while at the same time making it possible to write very expressive programs. Tabled execution avoids infinite failure in some cases, while improving efficiency in programs which repeat computations. CLP reduces the search tree and brings the power of solving (in)equations over arbitrary domains. As in the LP case, CLP systems can also benefit from the power of tabling. Previous implementations which take full advantage of the ideas behind tabling (e.g., forcing suspension, answer subsumption, etc., wherever necessary to avoid recomputation and to terminate whenever possible) did not offer a simple, well-documented, easy-to-understand interface, which would be necessary to make the integration of arbitrary CLP solvers into existing tabling systems possible. This clearly hinders a more widespread usage of the combination of both facilities. In this thesis we examine the requirements that a constraint solver must fulfill in order to be interfaced with a tabling system. We propose and implement a framework, which we have called Mod TCLP, with a minimal set of operations (e.g., entailment checking and projection) which the constraint solver has to provide to the tabling engine. We validate the design of Mod TCLP through a series of use cases: we re-engineer a previously existing tabled constraint domain (difference constraints) which was connected in an ad-hoc manner with the tabling engine in Ciao Prolog; we integrate Holzbauer's CLP(Q) implementation with Ciao Prolog's tabling engine; and we implement a constraint solver over (finite) lattices. We evaluate its performance with several benchmarks that implement a simple abstract interpreter whose fixpoint is reached by means of tabled execution, and whose domain operations are handled by the constraint solver over (finite) lattices, where TCLP avoids recomputing subsumed abstractions.
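The minimal solver interface sketched above can be pictured as an abstract class. This is a hypothetical Python rendering for illustration only: Mod TCLP itself lives in the (Ciao) Prolog world, and the names and signatures below are assumptions, not the framework's actual API.

from abc import ABC, abstractmethod

class TabledConstraintSolver(ABC):
    # Operations a constraint solver would provide to the tabling
    # engine under a Mod TCLP-style design (illustrative names only).

    @abstractmethod
    def entails(self, store_a, store_b):
        """Return True if constraint store `store_a` implies `store_b`.
        The tabling engine uses entailment to detect that a call or
        answer is subsumed by a stored one and need not be recomputed."""

    @abstractmethod
    def project(self, store, variables):
        """Restrict `store` to the given variables, so that only the
        constraints relevant to a tabled call are saved in the table."""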
Abstract:
Technology transfer (TT) in the area of renewable energy (RE) has historically been an important tool for rural development (RD). Initially, TT was conceptualized as the purchase or donation of machinery from first-world countries, without any consideration of staff training or of the contextual conditions needed to adapt the technology to the needs of the country. Various studies have revealed the existence of different approaches to planning the TT of RE, demonstrating the high complexity of these projects in their social and contextual dimensions. This paper addresses the conceptual evolution of the TT of RE for RD, examining its different periods according to three criteria: the historical events that occurred, the role of stakeholders, and the changing objectives of the TT of RE for RD. The conceptual analysis of these changes uses the Working With People (WWP) model for planning and managing projects of high social complexity in RD. The analysis identifies four historical periods in the TT of RE and synthesizes the lessons of experience along the three dimensions (ethical-social, technical-entrepreneurial, and political-contextual) of the WWP model.
Abstract:
Reproducibility of scientific studies and results is a goal that every scientist must pursue when announcing research outcomes. The rise of computational science, as a way of conducting empirical studies by using mathematical models and simulations, has opened a new range of challenges in this context. The adoption of workflows as a way of detailing the scientific procedure of these experiments, along with the experimental data conservation initiatives that have been undertaken during the last decades, has partially eased this problem. However, in order to fully address it, the conservation and reproducibility of the computational equipment related to these experiments must also be considered. The wide range of software and hardware resources required to execute a scientific workflow implies that a comprehensive description detailing what those resources are and how they are arranged is necessary. In this thesis we address the issue of reproducibility of execution environments for scientific workflows by documenting them in a formalized way, which can later be used to obtain an equivalent environment. To that end, we propose a set of semantic models for representing and relating the relevant information about those environments, as well as a set of tools that uses these models to generate a description of the infrastructure, and an algorithmic process that consumes these descriptions to derive a new execution environment specification, which can be enacted into a new equivalent environment using virtualization solutions. We apply these three contributions to a set of representative scientific experiments, belonging to different scientific domains and exposing different software and hardware requirements. The obtained results show the feasibility of the proposed approach, successfully reproducing the target experiments under different virtualization environments.
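As a rough illustration of the final step (deriving an enactable specification from an infrastructure description), an environment description might be rendered as a container build file. The field names and the Dockerfile target below are invented for the example; they are not the thesis's actual semantic models or tooling.

def to_dockerfile(env):
    # Render a minimal environment description as a Dockerfile string:
    # a base OS image, required software packages, and environment
    # variables (hypothetical description layout).
    lines = [f"FROM {env['base_image']}"]
    if env.get("packages"):
        lines.append("RUN apt-get update && apt-get install -y "
                     + " ".join(env["packages"]))
    for var, value in env.get("variables", {}).items():
        lines.append(f"ENV {var}={value}")
    return "\n".join(lines)

env = {"base_image": "ubuntu:20.04",
       "packages": ["python3", "python3-numpy"],
       "variables": {"WF_HOME": "/opt/workflow"}}
print(to_dockerfile(env))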
Abstract:
The French CEA, together with EDF and the IAEA, recently organised an international benchmark to evaluate the ability to model the mechanical behaviour of a typical nuclear reinforced concrete structure subjected to seismic demands. The participants were provided with descriptions of the structure and the testing campaign; they had to propose the numerical model and the material laws for the concrete (stage #1). A mesh of beam and shell elements was generated; for modelling the concrete, a damaged plasticity model was used, but a smeared crack model was also investigated. Some of the initial experimental results, with the mock-up remaining in the elastic range, were provided to the participants for calibrating their models (stage #2). Predictions had to be produced in terms of eigenfrequencies and motion time histories. The calculated frequencies reasonably reproduced the experimental ones; the time histories, calculated by modal response analysis, also adequately reproduced the observed amplifications. The participants were then expected to predict the structural response under strong ground motions (stage #3), which increased progressively up to a history recorded during the 1994 Northridge earthquake, followed by an aftershock. These results were produced using an explicit solver and a damaged plasticity model for the concrete, although an implicit solver with a smeared crack model was also investigated. The paper presents the conclusions of the pre-test exercise, as well as some observations from additional simulations conducted after the experimental results were made available.
Abstract:
The innocent Job suffers, his friends are no help, and then Job screams at Yahweh, demanding justice. The first surprise is that Yahweh responds to Job, does not criticize him, but tells him he is ignorant and then gives him a science lesson. The content of that lesson is the second surprise.