21 results for The Haia Program
at Universidad Politécnica de Madrid
Abstract:
Many of the material models most frequently used for the numerical simulation of the behavior of concrete subjected to high strain rates were originally developed for the simulation of ballistic impact. Therefore, they are plasticity-based models in which the compressive behavior is modeled in a complex way, while their tensile failure criterion is considerably simpler. As concrete elements usually fail in tension when subjected to blast loading, available concrete material models for high strain rates may not accurately represent their real behavior. In this research work an experimental program of reinforced concrete flat elements subjected to blast load is presented. Altogether four detonation tests are conducted, in which 12 slabs of two different concrete types are subjected to the same blast load. The results of the experimental program are then used for the development and adjustment of the numerical tools needed in the modeling of concrete elements subjected to blast.
Abstract:
Background. Over the past two decades, a striking increase in the number of people with metabolic syndrome (MetS) worldwide has taken place. Many studies compare prevalences using different criteria and metabolic risk estimation formulas, and perhaps their main achievement is to reinforce the need for a standardized international definition. Despite these discrepancies, there is no doubt that this is a public health problem. There is an urgent need for strategies to prevent and manage this emerging global epidemic, and special consideration should be given to behavior and lifestyle changes, mainly diet and exercise. However, there is still controversy about the most effective type of exercise and its combination with diet to achieve health improvements. Objectives. To study the metabolic risk scores used in the literature and the diet and exercise therapies for the treatment of MetS factors in overweight adults. Research design. The data used in this analysis were collected first in a pilot study and later as part of the "Programas de Nutrición y Actividad Física para el tratamiento de la obesidad" (PRONAF) study. The PRONAF Study is a clinical research project on nutrition and physical activity programs for overweight and obesity, carried out in Spain (2008-2011). It was designed, in part, to match the volume and intensity of endurance, strength and combined training protocols in order to evaluate their impact on risk factors and MetS prevalence in overweight and obese people. The design and protocol comprised three exercise modes (endurance, strength and combined training) and diet restriction in a randomized controlled trial concerning diverse health status variables. The main variables under investigation were habitual physical activity, markers of body fat, fasting serum levels of insulin, glucose, triglycerides, total, LDL and HDL cholesterol, blood pressure, and diet and exercise parameters. Main outcomes. A) The metabolic risk scores studied presented contradictory results in relation to the metabolic risk of an individual, depending on the mathematical method used and the variables included, both in healthy women and in overweight adults. B) The proposed protocol combining strength and endurance training with a balanced diet was the best strategy for improving MetS risk in overweight adults. C) The supervised endurance, strength and combined training protocols with diet restriction did not achieve further improvements in the lipid profile beyond those obtained with a habitual clinical practice protocol of dietary advice and standard physical activity recommendations in overweight adults.
Abstract:
Static analyses of object-oriented programs usually rely on intermediate representations that respect the original semantics while having a more uniform and basic syntax. Most of the work involving object-oriented languages and abstract interpretation usually omits the description of that language or just refers to the Control Flow Graph (CFG) it represents. However, this lack of formalization, on the one hand, results in an absence of assurances regarding the correctness of the transformation and, on the other, typically couples the analysis strongly to the source language. In this work we present a framework for the analysis of object-oriented languages in which, in a first phase, we transform the input program into a representation based on Horn clauses. This allows, on the one hand, proving the transformation correct by checking a simple condition and, on the other, applying an existing analyzer for (constraint) logic programming to automatically derive a safe approximation of the semantics of the original program. The approach is flexible in the sense that the first phase decouples the analyzer from most language-dependent features, and correct because the set of Horn clauses returned by the transformation phase safely approximates the standard semantics of the input program. The resulting analysis is also reasonably scalable due to the use of mature, modular (C)LP-based analyzers. The overall approach allows us to report results for medium-sized programs.
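As a rough, hypothetical illustration of this kind of transformation (not the framework's actual encoding), the sketch below renders a tiny method as one Horn clause per basic block, in a form a (C)LP analyzer could consume; the instruction names and clause shapes are invented for the example.

```python
# Illustrative only: encode a tiny method as Horn clauses (one clause per
# basic block), in the spirit of CFG-to-Horn-clause translations.
# The instruction set and naming scheme are invented for this sketch.

def block_to_clause(name, params, body_atoms, succ_call):
    """Render one basic block as a Horn clause 'head :- body.'"""
    head = f"{name}({', '.join(params)})"
    body = body_atoms + ([succ_call] if succ_call else [])
    return f"{head} :- {', '.join(body)}." if body else f"{head}."

# Source (pseudo-Java):  int abs(int x) { if (x >= 0) return x; else return -x; }
clauses = [
    block_to_clause("abs_entry", ["X", "R"], ["X >= 0"], "abs_then(X, R)"),
    block_to_clause("abs_entry", ["X", "R"], ["X < 0"],  "abs_else(X, R)"),
    block_to_clause("abs_then",  ["X", "R"], ["R = X"],  None),
    block_to_clause("abs_else",  ["X", "R"], ["R = -X"], None),
]

for c in clauses:
    print(c)
```

The point of such an encoding is that control flow and data flow both become ordinary clause bodies, so correctness can be argued clause by clause and an existing (C)LP analyzer can be reused unchanged.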
Abstract:
We describe the current status of and provide performance results for a prototype compiler of Prolog to C, ciaocc. ciaocc is novel in that it is designed to accept different kinds of high-level information, typically obtained via an automatic analysis of the initial Prolog program and expressed in a standardized language of assertions. This information is used to optimize the resulting C code, which is then processed by an off-the-shelf C compiler. The basic translation process essentially mimics the unfolding of a bytecode emulator with respect to the particular bytecode corresponding to the Prolog program. This is facilitated by a flexible design of the instructions and their lower-level components. This approach allows reusing a sizable amount of the machinery of the bytecode emulator: predicates already written in C, data definitions, memory management routines and areas, etc., as well as mixing emulated bytecode with native code in a relatively straightforward way. We report on the performance of programs compiled by the current version of the system, both with and without analysis information.
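A toy illustration of the unfolding idea (not ciaocc's instruction set, nor the C code it actually emits): a small dispatch table is applied once, at compile time, to a fixed bytecode sequence, producing straight-line C-like statements instead of an interpretation loop; every name below is invented.

```python
# Toy partial evaluation of a bytecode dispatch loop: instead of running
# 'while True: dispatch(code[pc])' at run time, we unfold it over a known
# bytecode sequence and emit one C-like statement per instruction.
# Instruction set and emitted code are invented for illustration.

EMIT = {
    "get_constant": lambda a: f"bind_constant(&frame[{a[0]}], {a[1]});",
    "put_value":    lambda a: f"frame[{a[0]}] = frame[{a[1]}];",
    "call":         lambda a: f"if (!{a[0]}(frame)) goto fail;",
    "proceed":      lambda a: "return TRUE;",
}

def unfold(bytecode):
    """Return the C-like statements an emulator would have executed."""
    return [EMIT[op](args) for op, *args in bytecode]

bytecode = [
    ("get_constant", 0, 42),
    ("put_value", 1, 0),
    ("call", "foo_1"),
    ("proceed",),
]

print("\n".join(unfold(bytecode)))
```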
Abstract:
This paper presents a technique for achieving a class of optimizations related to the reduction of checks within cycles. The technique uses both Program Transformation and Abstract Interpretation. After a first pass of an abstract interpreter which detects simple invariants, program transformation is used to build a hypothetical situation that simplifies some predicates that should be executed within the cycle. This transformation implements the heuristic hypothesis that once conditional tests hold they may continue doing so recursively. Specialized versions of predicates are generated to detect and exploit those cases in which the invariance may hold. Abstract interpretation is then used again to verify the truth of such hypotheses and confirm the proposed simplification. This allows optimizations that go beyond those possible with only one pass of the abstract interpreter over the original program, as is normally the case. It also allows selective program specialization using a standard abstract interpreter not specifically designed for this purpose, thus simplifying the design of this already complex module of the compiler. In the paper, a class of programs amenable to such optimization is presented, along with some examples and an evaluation of the proposed techniques in some application areas such as floundering detection and reducing run-time tests in automatic logic program parallelization. The analysis of the examples presented has been performed automatically by an implementation of the technique using existing abstract interpretation and program transformation tools.
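A minimal sketch of the heuristic in an imperative setting (Python here, not logic programming, and not the paper's transformation): once the run-time test succeeds, control jumps to a specialized version of the loop from which the test has been removed.

```python
# Illustrative only: specialize away a check that, once it holds, is assumed
# to keep holding on later iterations (the "invariance" heuristic); in the
# actual technique that assumption is verified by abstract interpretation.

def process(item):
    return item * 2          # stand-in for real per-item work

def loop_generic(items):
    out = []
    for i, x in enumerate(items):
        if isinstance(x, int):            # run-time check inside the cycle
            # Hypothesis: from here on, every element is an int, so the
            # remaining iterations can use the check-free specialized loop.
            return out + loop_specialized(items[i:])
        out.append(None)                  # fallback path for non-ints
    return out

def loop_specialized(items):
    # Specialized version: the isinstance check has been removed.
    return [process(x) for x in items]

print(loop_generic(["a", "b", 1, 2, 3]))   # [None, None, 2, 4, 6]
```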
Abstract:
The relationship between abstract interpretation and partial deduction has received considerable attention and (partial) integrations have been proposed starting from both the partial deduction and abstract interpretation perspectives. In this work we present what we argue is the first fully described generic algorithm for efficient and precise integration of abstract interpretation and partial deduction. Taking as starting point state-of-the-art algorithms for context-sensitive, polyvariant abstract interpretation and (abstract) partial deduction, we present an algorithm which combines the best of both worlds. Key ingredients include the accurate success propagation inherent to abstract interpretation and the powerful program transformations achievable by partial deduction. In our algorithm, the calls which appear in the analysis graph are not analyzed w.r.t. the original definition of the procedure but w.r.t. specialized definitions of these procedures. Such specialized definitions are obtained by applying both unfolding and abstract executability. Our framework is parametric w.r.t. different control strategies and abstract domains. Different combinations of such parameters correspond to existing algorithms for program analysis and specialization. Simultaneously, our approach opens the door to the efficient computation of strictly more precise results than those achievable by each of the individual techniques. The algorithm is now one of the key components of the CiaoPP analysis and specialization system.
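A heavily simplified sketch of how such an integration can be organized (this is not the CiaoPP algorithm): a memo table keyed by (procedure, abstract call pattern) stores a specialized definition produced by unfolding together with the abstract success pattern propagated back to callers; the domain, unfolding and transfer functions below are stubs.

```python
# Greatly simplified worklist combining specialization (unfolding) with
# abstract success propagation.  Domains, unfolding and transfer functions
# are placeholders; the real algorithms are far more involved.

from collections import deque

def unfold(proc, call_pat):
    """Stub: return a specialized definition and the calls it still makes."""
    return f"{proc}__{call_pat}", []          # no residual calls in this toy

def abstract_success(spec_def, call_pat):
    """Stub transfer function: abstract success pattern of the spec. def."""
    return call_pat                           # identity, for illustration

def analyze(entry_proc, entry_pat):
    memo, work = {}, deque([(entry_proc, entry_pat)])
    while work:
        proc, pat = work.popleft()
        if (proc, pat) in memo:
            continue
        spec_def, residual_calls = unfold(proc, pat)
        memo[(proc, pat)] = (spec_def, abstract_success(spec_def, pat))
        work.extend(residual_calls)           # analyze callees w.r.t. their patterns
    return memo

print(analyze("append", "ground_list"))
```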
Improving the compilation of Prolog to C using type and determinism information: Preliminary results
Abstract:
We describe the current status of and provide preliminary performance results for a compiler of Prolog to C. The compiler is novel in that it is designed to accept different kinds of high-level information (typically obtained via an analysis of the initial Prolog program and expressed in a standardized language of assertions) and use this information to optimize the resulting C code, which is then further processed by an off-the-shelf C compiler. The basic translation process used essentially mimics an unfolding of a C-coded bytecode emulator with respect to the particular bytecode corresponding to the Prolog program. Optimizations are then applied to this unfolded program. This is facilitated by a more flexible design of the bytecode instructions and their lower-level components. This approach allows reusing a sizable amount of the machinery of the bytecode emulator: ancillary pieces of C code, data definitions, memory management routines and areas, etc., as well as mixing bytecode emulated code with natively compiled code in a relatively straightforward way. We report on the performance of programs compiled by the current version of the system, both with and without analysis information.
Abstract:
This paper discusses some issues which arise in the dataflow analysis of constraint logic programming (CLP) languages. The basic technique applied is that of abstract interpretation. First, some types of optimizations possible in a number of CLP systems (including efficient parallelization) are presented, and the information that has to be obtained at compile-time in order to be able to implement such optimizations is considered. Two approaches are then proposed and discussed for obtaining this information for a CLP program: one based on an analysis of a CLP metainterpreter using standard Prolog analysis tools, and a second one based on direct analysis of the CLP program. For the second approach an abstract domain which approximates groundness (also referred to as "definiteness") information (i.e., being constrained to a single value) and the related abstraction functions are presented.
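A toy version of a groundness ("definiteness") analysis of the kind mentioned, not the abstract domain defined in the paper: variables map to 'g' (definitely a single value) or 'any', and definiteness propagates through constraints whose right-hand-side variables are all definite; the propagation rule is an assumption made for the example.

```python
# Toy groundness/definiteness analysis for a conjunction of constraints.
# 'g' = definitely bound to a single value, 'any' = unknown.
# Constraint handling here is deliberately naive and purely illustrative.

def analyze_definiteness(constraints, env=None):
    env = dict(env or {})
    changed = True
    while changed:                             # iterate to a (trivial) fixpoint
        changed = False
        for lhs, rhs_vars in constraints:      # each constraint defines lhs from rhs_vars
            if all(env.get(v) == "g" for v in rhs_vars) and env.get(lhs) != "g":
                env[lhs] = "g"                 # lhs becomes definite
                changed = True
    return env

# X = 3, Y = X + 2, Z = Y + W   (W unconstrained)
constraints = [("X", []), ("Y", ["X"]), ("Z", ["Y", "W"])]
print(analyze_definiteness(constraints, {"W": "any"}))
# {'W': 'any', 'X': 'g', 'Y': 'g'}  -> Z stays indefinite
```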
Abstract:
The great developments that have occurred during the last few years in the finite element method and its applications have kept other options for computation hidden. The boundary integral equation method now appears as a valid alternative and, in certain cases, has significant advantages. This method deals only with the boundary of the domain, while the F.E.M. analyses the whole domain. This has the following advantages: the dimensions of the problem to be studied are reduced by one, consequently simplifying the system of equations and the preparation of input data; it is also possible to analyse infinite domains without discretization errors. These simplifications have the drawbacks of having to solve a full, non-symmetric matrix, and some difficulties arise in the imposition of boundary conditions when complicated variations of the function over the boundary are assumed. In this paper a practical treatment of these problems, in particular the imposition of boundary conditions, has been carried out using the computer program shown below. Program SERBA solves general elastostatics problems in two-dimensional continua using the boundary integral equation method. The boundary of the domain is discretized by line elements over which the functions are assumed to vary linearly. Data (stresses and/or displacements) are introduced in the local co-ordinate system (element co-ordinates). Resulting stresses are obtained in local co-ordinates and displacements in a general system. The program has been written in Fortran ASCII and implemented on a Univac 1108 computer. For 100 elements the core requirements are about 40 Kwords. A Fortran IV version (3 segments), implemented on a Hewlett-Packard 21 MX computer using 15 Kwords, is also available.
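A small numeric illustration of the trade-off described, with schematic figures that are not taken from SERBA: for a square domain with n divisions per side, a domain discretization carries on the order of n² unknowns while a boundary discretization carries on the order of 4n, at the cost of a full, non-symmetric system, here solved with a dense routine.

```python
# Illustrative comparison of unknown counts (domain vs. boundary
# discretization of a square) plus a dense non-symmetric solve, the kind
# of system a boundary-integral formulation produces.  Numbers are schematic.
import numpy as np

n = 20                                  # divisions per side of a square domain
domain_unknowns = (n + 1) ** 2          # ~ nodes of a full 2-D mesh (FEM-like)
boundary_unknowns = 4 * n               # ~ nodes on the contour only (BEM-like)
print(domain_unknowns, boundary_unknowns)   # 441 vs 80

# Boundary methods trade size for structure: the matrix is small but
# fully populated and non-symmetric.
rng = np.random.default_rng(0)
A = rng.standard_normal((boundary_unknowns, boundary_unknowns)) + \
    boundary_unknowns * np.eye(boundary_unknowns)   # keep it well conditioned
b = rng.standard_normal(boundary_unknowns)
x = np.linalg.solve(A, b)               # dense LU solve, no symmetry exploited
print(np.allclose(A @ x, b))            # True
```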
Abstract:
At present, the University curricula of most countries do not include decision theory and the mathematical models that aid decision making, neither in undergraduate programs nor in doctoral and master's programs. At the Higher Technical School of Agronomic Engineers of the Technical University of Madrid (ETSIA-UPM), the need was felt to offer future engineers training in a subject that could help them make decisions in their profession. Throughout their lives they will have to make many decisions; some will be important and others not. On the personal level, they will have to make several very important decisions, such as the choice of a career, a job or a partner, but in the professional field decision making is the main role of managers, politicians and leaders. They are expected to be decision makers and will be paid for it. Therefore, it is hard to accept that professionals called to exercise management responsibilities in companies receive no training in such an important matter. For this reason, in the year 2000 the University Board was asked to introduce into the curricula an optional second-cycle subject of 4.5 credits entitled "Mathematical Methods for Making Decisions". A syllabus was elaborated, the teaching material prepared, and software such as Maple, Lingo, Math Cad, etc. installed in several computer classrooms where the course would be taught. In the 2000-2001 academic year this subject was offered with great acceptance, exceeding the capacity forecasts, so that additional classrooms had to be prepared. This course was taught in the Department of Applied Mathematics for Agronomic Engineering, as an extension of the credits devoted to mathematics in the engineering degree.
Abstract:
This paper summarizes the work developed in order to establish a framework for the seismic retrofitting of bridges. In this context, the first objective is to find a numerical model to evaluate the damage induced in a structure under seismic action, as an index of its vulnerability. The model used has the advantage that it is based on concepts of fracture mechanics and concentrated plasticity; as a result, the work is grounded on basic principles. The performance of this model is being evaluated. Some results of the computer program developed for this purpose are shown.
Abstract:
The verification and validation activity plays a fundamental role in improving software quality. Determining which are the most effective techniques for carrying out this activity has been an aspiration of experimental software engineering researchers for years. This paper reports a controlled experiment evaluating the effectiveness of two unit testing techniques (the functional testing technique known as equivalence partitioning (EP) and the control-flow structural testing technique known as branch testing (BT)). This experiment is a literal replication of Juristo et al. (2013). Both experiments serve the purpose of determining whether the effectiveness of BT and EP varies depending on whether or not the faults are visible for the technique (InScope or OutScope, respectively). We have used the materials, design and procedures of the original experiment, but in order to adapt the experiment to the context we have: (1) reduced the number of studied techniques from 3 to 2; (2) assigned subjects to experimental groups by means of stratified randomization to balance the influence of programming experience; (3) localized the experimental materials; and (4) adapted the training duration. We ran the replication at the Escuela Politécnica del Ejército Sede Latacunga (ESPEL) as part of a software verification & validation course. The experimental subjects were 23 master's degree students. EP is more effective than BT at detecting InScope faults. The session/program and group variables are found to have significant effects. BT is more effective than EP at detecting OutScope faults. The session/program and group variables have no effect in this case. The results of the replication and the original experiment are similar with respect to testing techniques. There are some inconsistencies with respect to the group factor. They can be explained by small sample effects. The results for the session/program factor are inconsistent for InScope faults. We believe that these differences are due to a combination of the fatigue effect and a technique x program interaction. Although we were able to reproduce the main effects, the changes to the design of the original experiment make it impossible to identify the causes of the discrepancies for sure. We believe that further replications closely resembling the original experiment should be conducted to improve our understanding of the phenomena under study.
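To illustrate the two techniques being compared, with a made-up function rather than the experiment's materials: equivalence partitioning (EP) derives one representative input per class of the specification, while branch testing (BT) chooses inputs so that every branch of the code is exercised.

```python
# Hypothetical example contrasting the two unit-testing techniques.
# The function under test and its test cases are invented for illustration.

def classify(age):
    """Spec: minor (<18), adult (18-64), senior (>=65)."""
    if age < 18:
        return "minor"
    elif age < 65:
        return "adult"
    else:
        return "senior"

# Equivalence partitioning (EP): one representative per spec-level class.
ep_cases = {10: "minor", 30: "adult", 70: "senior"}

# Branch testing (BT): enough inputs to take every branch of the code
# (here the same three branches, but chosen from the code, not the spec).
bt_cases = {17: "minor", 18: "adult", 65: "senior"}

for cases in (ep_cases, bt_cases):
    assert all(classify(age) == expected for age, expected in cases.items())
print("all sample cases pass")
```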
Abstract:
The Space Situational Awareness (SSA) program of the European Space Agency (ESA) protects Europe's citizens and their satellite-based services by detecting space hazards. The ESA Ground Systems (GS) division is currently designing a phased array radar composed of thousands of radiating elements for future stages of the SSA program [1]. The radar shall guarantee the detection of most of the Low Earth Orbit (LEO) space debris, providing a general map of space junk. While range accuracy is mainly dictated by the radar waveform, the detection and tracking of small objects in LEO regimes is highly dependent on the angular accuracy achieved by the smart phased array antenna, demonstrating the importance of the performance of this architecture.
Abstract:
This paper develops an automatic procedure for the optimal numbering of members and nodes in tree structures. With it, the stiffness matrix is optimally conditioned whether a direct solution algorithm or a frontal one is used to solve the system of equations. In spite of its effectiveness, the procedure is strikingly simple, and so is the computer program shown below.
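One common renumbering strategy, shown purely as a sketch and not as the paper's procedure: number the nodes of the tree in breadth-first order from the root so that connected nodes receive nearby numbers, which reduces the bandwidth of the assembled stiffness matrix.

```python
# Illustrative breadth-first renumbering of a tree structure and its effect
# on matrix bandwidth (max |i - j| over connected node pairs).  This is a
# generic strategy, not necessarily the procedure developed in the paper.
from collections import deque

def bfs_numbering(tree, root):
    """Map old node labels to new numbers in breadth-first order."""
    order, seen, queue = {}, {root}, deque([root])
    while queue:
        node = queue.popleft()
        order[node] = len(order)
        for child in tree.get(node, []):
            if child not in seen:
                seen.add(child)
                queue.append(child)
    return order

def bandwidth(edges, numbering):
    return max(abs(numbering[a] - numbering[b]) for a, b in edges)

tree = {0: [5, 3], 5: [1, 4], 3: [2, 6]}          # adjacency (children) lists
edges = [(0, 5), (0, 3), (5, 1), (5, 4), (3, 2), (3, 6)]

original = {n: n for n in range(7)}                # keep the given labels
renumbered = bfs_numbering(tree, 0)
print(bandwidth(edges, original), bandwidth(edges, renumbered))  # 5 vs 4
```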
Abstract:
The SESAR (Single European Sky ATM Research) program is an ambitious research and development initiative to design the future European air traffic management (ATM) system. The study of the behavior of ATM systems using agent-based modeling and simulation tools can help the development of new methods to improve their performance. This paper presents an overview of existing agent-based approaches in air transportation (paying special attention to the challenges that exist for the design of future ATM systems) and, subsequently, describes a new agent-based approach that we proposed in the CASSIOPEIA project, which was developed according to the goals of the SESAR program. In our approach, we use agent models for different ATM stakeholders; in contrast to previous work, our solution models new collaborative decision processes for flow traffic management, uses an intermediate level of abstraction (useful for simulations at larger scales), and was designed to be a practical tool (open and reusable) for the development of different ATM studies. It was successfully applied in three studies related to the design of future ATM systems in Europe.