26 results for Trial and error

at Universidad Politécnica de Madrid


Relevance:

100.00%

Publisher:

Abstract:

One of the most common development processes for carrying out an architectural project is trial and error. This process of testing is usually approached in two ways: either it is carried out in order to refine an increasingly optimal position, or it serves to explore new lines of research. To examine this in depth, the article presents an analysis of two different housing design processes developed by trial and error, both reference works in the history of architecture: the Villa Stonborough by Wittgenstein and the Villa Moller by Adolf Loos. Although both belong to the same historical period, they were developed in very different, almost opposed, ways. The study aims to identify the concepts that drove their different modes of production, so that they can be extrapolated to other similar cases.

Relevance:

100.00%

Publisher:

Abstract:

Changes in blood pressure after a beta-blocker.

Relevance:

100.00%

Publisher:

Abstract:

The energetic performance of landfill biogas (LB) and biodigester biogas (BB) from municipal waste was examined in consumption tests. These tests were performed in situ at a gas generation plant associated with a landfill facility in Madrid (Spain), following the standard UNE-EN 30-2-1 (1999). The jets of a domestic cooker commonly used for natural gas (NG) or liquefied petroleum gas (LPG) were modified to operate with the biogases produced at the facility. The working pressures best suited to the tested gases, i.e., those avoiding flashback and flame lift and ensuring a stable, correctly functioning flame during combustion, were determined by trial and error. Both biogases returned their optimum energetic performance for the transfer of heat to water in a metallic recipient (as required by the above standard) at a supply pressure of 10 mbar. Domestic cookers are normally supplied with NG at a pressure of 20 mbar, at which the energetic performance of the G20 reference gas was higher than that of either biogas (52.84% compared to 38.06% and 49.77%, respectively). Data on these issues, including as-yet unexplored feedstocks, are required for the correct conversion of domestic cookers in order to avoid the risk of serious personal injury or property damage.
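The efficiency figures above come from heating water and comparing the useful heat with the energy content of the gas burned. As a rough sketch of that calculation (illustrative figures, not the measured test values; the actual UNE-EN 30-2-1 procedure prescribes specific vessels, water loads and temperature limits):

```python
# Sketch of the energetic-performance ratio behind a cooker consumption
# test: useful heat delivered to the water over energy supplied by the gas.
# Figures used in examples are illustrative, not the paper's measurements.

WATER_SPECIFIC_HEAT = 4186.0  # J/(kg*K)

def burner_efficiency(water_kg, delta_t_k, gas_m3, heating_value_j_m3):
    """Fraction of the gas's energy actually transferred to the water."""
    useful_j = water_kg * WATER_SPECIFIC_HEAT * delta_t_k
    supplied_j = gas_m3 * heating_value_j_m3
    return useful_j / supplied_j
```

For example, heating 2 kg of water by 60 K while burning 0.0265 m³ of a gas with a heating value of 37.8 MJ/m³ gives an efficiency of roughly 0.50, in the range reported above for the G20 reference gas.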

Relevance:

100.00%

Publisher:

Abstract:

Many macroscopic properties (hardness, corrosion, catalytic activity, etc.) are directly related to the surface structure, that is, to the position and chemical identity of the outermost atoms of the material. Current experimental techniques for determining it produce a "signature" from which the structure must be inferred by solving an inverse problem: a solution is proposed, its corresponding signature computed and then compared to the experiment. This is a challenging optimization problem where the search space and the number of local minima grow exponentially with the number of atoms, so a solution cannot be achieved for arbitrarily large structures. Nowadays it is tackled with a mixture of human knowledge and local search techniques: an expert proposes a solution that is refined using a local minimizer; if the outcome does not fit the experiment, a new solution must be proposed. Solving a small surface can take from days to weeks of this trial and error method. Here we describe our ongoing work on its solution. We use a hybrid algorithm that mixes evolutionary techniques with trust-region methods and reuses knowledge gained during the execution to avoid repeated searches of the same structures. Its parallelization produces good results even without requiring the gathering of the full population, so it can be used in loosely coupled environments such as grids. With this algorithm, test cases that previously took weeks of expert time can be solved automatically in a day or two of uniprocessor time.
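The hybrid strategy described above can be sketched in miniature: evolutionary moves propose candidates, a local minimizer refines each one, and a cache of refined results avoids repeating searches. The sketch below is a toy illustration under stated assumptions (a Rastrigin test function instead of the real signature misfit, and a crude coordinate search instead of a trust-region method), not the authors' implementation:

```python
import math
import random

def toy_energy(x):
    """Stand-in for the expensive signature-vs-experiment misfit
    (Rastrigin function: many local minima, global minimum 0 at the origin)."""
    return sum(xi * xi - 10 * math.cos(2 * math.pi * xi) + 10 for xi in x)

def local_refine(x, step=0.1, iters=120):
    """Crude derivative-free local minimizer standing in for the
    trust-region refinement step."""
    x = list(x)
    e = toy_energy(x)
    for _ in range(iters):
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                trial = x[:]
                trial[i] += d
                et = toy_energy(trial)
                if et < e:
                    x, e, improved = trial, et, True
        if not improved:
            step *= 0.5          # shrink the search radius
            if step < 1e-6:
                break
    return x, e

def hybrid_search(dim=3, pop=10, gens=15, seed=0):
    """Evolutionary loop over candidates; every offspring is refined
    locally, and refined results are cached so the same region of the
    search space is not explored twice."""
    rng = random.Random(seed)
    elites = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop)]
    cache = {}

    def refined(x):
        key = tuple(round(c, 1) for c in x)   # coarse key: "same" candidate
        if key not in cache:
            cache[key] = local_refine(x)
        return cache[key]

    best = min((refined(x) for x in elites), key=lambda t: t[1])
    for _ in range(gens):
        offspring = [[xi + rng.gauss(0, 0.5) for xi in rng.choice(elites)]
                     for _ in range(pop)]
        scored = sorted((refined(x) for x in elites + offspring),
                        key=lambda t: t[1])
        elites = [list(x) for x, _ in scored[:pop]]
        best = min(best, scored[0], key=lambda t: t[1])
    return best
```

Selection keeps the lowest-energy refined structures, so over the generations the population drifts toward progressively better attractor basins instead of restarting blindly.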

Relevance:

100.00%

Publisher:

Abstract:

The arrangement of atoms at the surface of a solid accounts for many of its properties: hardness, chemical activity, corrosion, etc. are dictated by the precise surface structure. Hence, finding it has a broad range of technical and industrial applications, and the ability to solve this problem opens the possibility of designing, by computer, materials with properties tailored to specific applications. Since the search space grows exponentially with the number of atoms, a solution cannot be achieved for arbitrarily large structures. At present a trial and error procedure is used: an expert proposes a structure as a candidate solution and runs a local optimization procedure on it. The solution relaxes to the local minimum of the attractor basin containing the initial point, which may or may not be the global minimum. This procedure is very time consuming and, for reasonably sized surfaces, can take many iterations and much effort from the expert. Here we report on a visualization environment designed to steer this process in an attempt to solve bigger structures and reduce the time needed. The idea is to use an immersive environment to interact with the computation, with immediate feedback on the quality of the proposed structure so that the expert can explore the space of candidate solutions. The visualization environment is also able to communicate with the de facto standard local solver used for this problem: the user can send trial structures to the local minimizer and track their progress as they approach the minimum, which allows simultaneous testing of candidate structures. The system has also proved very useful as an educational tool for the field.

Relevance:

100.00%

Publisher:

Abstract:

In engineering design and development, before beginning the construction and implementation of a project's objectives, a series of preliminary analyses and simulations must be carried out to corroborate the initial hypothesis and obtain an empirical reference that satisfies the working conditions of the project. Often, results that satisfy the desired characteristics are obtained by iterating trial and error methods. These methods generally apply the same analysis procedure while varying a set of parameters, in order to adapt a technology to the desired purpose. Today, powerful computers and mathematical algorithms are available that solve different kinds of calculation problems quickly and efficiently. Applications that solve such problems rapidly and accurately are of particular interest in the analysis and synthesis of engineering solutions, especially when similar expressions with varying constants are involved, since resolution routines can be written that accept the parameters defining each problem. Moreover, by implementing code according to the theoretical basis of a technology, a single program can be used to study any problem related to that technology. This project implements the first stage of the optical device simulator Slabsim, which represents the energy distribution of an electromagnetic wave at optical frequencies guided through a planar dielectric waveguide, also known as a slab. The simulator is built on a graphical interface created with Matlab GUIDE, the graphical user interface development environment by Mathworks©, so that it is simple and intuitive to run simulations with little prior knowledge of the theory of these structures. In this way the engineer needs less time to find a solution that satisfies the requirements of a project involving planar dielectric waveguides, and the tool can serve a wide range of objectives based on this technology. One of the main goals of the project is the solution of the theoretical equations of slab waveguides by computational numerical methods, whose procedures can be extrapolated to other mathematical problems and give the author a solid conceptual grounding in them. For this reason, the differential and characteristic equations that govern these structures are solved by numerical methods in the core of the application, since in some cases no useful analytical expressions exist.
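The characteristic equations of slab waveguides mentioned above generally have no useful analytical solution, which is why the simulator's core relies on numerical methods. As a minimal illustration (not Slabsim's actual code), the even TE modes of a symmetric slab can be found by bisection on the transcendental equation u·tan(u) = √(V² − u²):

```python
import math

def te_even_modes(n_core, n_clad, half_width_um, wavelength_um):
    """Find the normalized transverse parameters u of the even TE modes of
    a symmetric slab waveguide by solving u*tan(u) = sqrt(V^2 - u^2)
    with plain bisection (the equation has no closed-form solution)."""
    k0 = 2 * math.pi / wavelength_um
    V = k0 * half_width_um * math.sqrt(n_core**2 - n_clad**2)

    def f(u):
        return u * math.tan(u) - math.sqrt(max(V**2 - u**2, 0.0))

    roots = []
    m = 0
    # Even-mode roots lie in (m*pi, m*pi + pi/2), one per interval below V.
    while m * math.pi < V:
        lo = m * math.pi + 1e-9
        hi = min(m * math.pi + math.pi / 2 - 1e-9, V - 1e-12)
        if lo >= hi or f(lo) * f(hi) > 0:
            break
        for _ in range(80):
            mid = 0.5 * (lo + hi)
            if f(lo) * f(mid) <= 0:
                hi = mid
            else:
                lo = mid
        roots.append(0.5 * (lo + hi))
        m += 1
    return V, roots
```

For a core index of 1.5, cladding index of 1.45, half-width 2 µm and wavelength 1.55 µm, the normalized frequency V is about 3.11, so the guide supports a single even TE mode.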

Relevance:

100.00%

Publisher:

Abstract:

This project defines the design and execution of representative contour and production blasts carried out at the Aguablanca mine (Badajoz, Spain). The theoretical design of the parameters required for the blasts (drilling pattern, powder factor and explosive charge) followed the methodology of standard drilling-and-blasting manuals, and was then adjusted to the actual needs of the project in order to improve the fragmentation, displacement and swelling results and to control flyrock. Since the initial theoretical calculations for the different blast types did not lead to optimal results, some of the above parameters had to be modified by trial and error. The blasting technician must be able to apply day-to-day variations that improve on the results of the initial theoretical calculation, in order to meet mining production requirements.
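Of the design parameters mentioned, the powder factor is the simplest to state: the mass of explosive divided by the volume of rock it must break. A minimal sketch with illustrative figures (not the Aguablanca parameters):

```python
def powder_factor(charge_per_hole_kg, burden_m, spacing_m, bench_height_m):
    """Powder factor (kg of explosive per m^3 of rock) for one blasthole:
    the charge divided by the rock prism it breaks (burden x spacing x height)."""
    volume_m3 = burden_m * spacing_m * bench_height_m
    return charge_per_hole_kg / volume_m3
```

Trial-and-error adjustment in the field then amounts to nudging burden, spacing or charge until fragmentation is acceptable and this ratio stays economical.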

Relevance:

100.00%

Publisher:

Abstract:

Matter presents itself to us in a multiplicity of forms or "appearances". Throughout history the relationship between matter and form has been examined in fields ranging from philosophy to science, including art and architecture, moving between disciplines from the most practical to the most artistic and generating opposed positions such as materialism versus idealism. The concept of matter has in turn changed as human consciousness and science have evolved, from matter conceived as an entity with "mass" to empty matter, where "mass" is an illusion that "appears" to us depending on the frequency at which its energy system vibrates. Josef Albers developed his teaching methodology from the concept of "matière". Matière is more than the look, an "appearance" that goes beyond crystallized form; it is the changing form that matter can adopt when transformed by human beings, who leave their mark on it. The three qualities of matière that Professor Albers proposed in his Preliminary Course exercises for developing "vision" with matière, from the Bauhaus to Yale University, are: structural, factural and textural. By developing observation with these three references in mind, the separation between the material and its appearance is discovered with honesty: "the discrepancy between physical fact and psychic effect". In a constant process of trial and error, individual sensitivity towards the material is developed, together with evaluation and critique, thanks to the dynamic of the workshop, which allows one, by comparison, to learn and evolve as an individual within a society. This inductive methodology, regulated by an economy of resources, promotes the creative thinking essential to produce, through articulation, a new language that expresses our relationship with the world and with life through visual formulation. Life constantly flows and oscillates between two opposite poles, generating the interrelations that weave the world; these interactions are what give life to Albers's artistic work. KEYWORDS: matter and matière; structural, factural and textural; vision; physical fact and psychic effect; creative thinking; life.

Relevance:

90.00%

Publisher:

Abstract:

Three methodologies to assess As bioaccessibility were evaluated using playground soil collected from 16 playgrounds in Madrid, Spain: two (Simplified Bioaccessibility Extraction Test: SBET, and hydrochloric acid extraction: HCl) assess gastric-only bioaccessibility and the third (Physiologically Based Extraction Test: PBET) evaluates mouth–gastric–intestinal bioaccessibility. Aqua regia-extractable (pseudo-total) As contents, which are routinely employed in risk assessments, were used as the reference to establish the following percentages of bioaccessibility: SBET – 63.1; HCl – 51.8; PBET – 41.6, the highest values being associated with the gastric-only extractions. For Madrid playground soils – characterised by a very uniform, weakly alkaline pH, and low Fe oxide and organic matter contents – the statistical analysis of the results indicates that, in contrast with other studies, the highest percentage of As in the samples was bound to carbonates and/or present as calcium arsenate. As opposed to the As bound to Fe oxides, this As is readily released in the gastric environment as the carbonate matrix is decomposed and calcium arsenate is dissolved, but some of it is subsequently sequestered in unavailable forms as the pH is raised to 5.5 to mimic intestinal conditions. The HCl extraction can be used as a simple and reliable (i.e. low residual standard error) proxy for the more expensive, time-consuming, and error-prone PBET methodology. The HCl method would essentially halve the estimate of carcinogenic risk for children playing in Madrid playground soils, providing a more representative value of the associated risk than the pseudo-total concentrations used at present.
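Using HCl as a proxy for PBET is, in practice, a calibration exercise: regress the PBET values on the HCl values over paired samples and check the residual standard error. A minimal least-squares sketch on synthetic data (not the paper's measurements):

```python
def fit_proxy(hcl, pbet):
    """Ordinary least-squares line pbet ~ a + b*hcl, so the cheap HCl
    extraction can stand in for the costlier PBET measurement."""
    n = len(hcl)
    mx = sum(hcl) / n
    my = sum(pbet) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(hcl, pbet))
         / sum((x - mx) ** 2 for x in hcl))
    a = my - b * mx
    return a, b
```

Once calibrated on paired samples, predicted PBET bioaccessibility is simply `a + b * hcl_value`, and the scatter of the residuals quantifies how reliable the proxy is.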

Relevance:

90.00%

Publisher:

Abstract:

The implementation of abstract machines involves complex decisions regarding, e.g., data representation, opcodes, or instruction specialization levels, all of which affect the final performance of the emulator and the size of the bytecode programs in ways that are often difficult to foresee. Besides, studying alternatives by implementing abstract machine variants is a time-consuming and error-prone task because of the level of complexity and optimization of competitive implementations, which makes them generally difficult to understand, maintain, and modify. This also makes it hard to generate specific implementations for particular purposes. To ameliorate those problems, we propose a systematic approach to the automatic generation of implementations of abstract machines. Different parts of their definition (e.g., the instruction set or the internal data and bytecode representation) are kept separate and automatically assembled in the generation process. Alternative versions of the abstract machine are therefore easier to produce, and variants of their implementation can be created mechanically, with specific characteristics for a particular application if necessary. We illustrate the practicality of the approach by reporting on an implementation of a generator of production-quality WAMs which are specialized for executing a particular fixed (set of) program(s). The experimental results show that the approach is effective in reducing emulator size.
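The generation idea can be illustrated in miniature: keep the instruction set as separate definitions and assemble the emulator's dispatch mechanically, so variants (e.g. a machine restricted to the opcodes one program uses) can be produced without rewriting the emulator by hand. A toy stack machine, not the WAM generator itself:

```python
# Toy sketch of generating an emulator from separately kept instruction
# definitions (illustrative stack machine, not the paper's WAM generator).

INSTRUCTIONS = {
    "push": lambda stack, arg: stack.append(arg),
    "add":  lambda stack, arg: stack.append(stack.pop() + stack.pop()),
    "mul":  lambda stack, arg: stack.append(stack.pop() * stack.pop()),
}

def generate_emulator(instruction_set):
    """Assemble a bytecode interpreter from an instruction-set definition.
    Passing a restricted dict yields a specialized, smaller emulator."""
    def run(bytecode):
        stack = []
        for opcode, arg in bytecode:
            instruction_set[opcode](stack, arg)
        return stack[-1]
    return run
```

A specialized variant is then just `generate_emulator({op: INSTRUCTIONS[op] for op in used_opcodes})`, which mirrors the paper's point that emulators tailored to a fixed program can be produced mechanically.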

Relevance:

90.00%

Publisher:

Abstract:

A range of methodologies and techniques are available to guide the design and implementation of language extensions and domain-specific languages. A simple yet powerful technique is based on source-to-source transformations interleaved across the compilation passes of a base language. Despite being a successful approach, it has the main drawback that the input source code is lost in the process. When considering the whole workflow of program development (warning and error reporting, debugging, or even program analysis), program translations are no more powerful than a glorified macro language. In this paper, we propose an augmented approach to language extensions for Prolog, where symbolic annotations are included in the target program. These annotations allow selectively reversing the translated code. We illustrate the approach by showing that coupling it with minimal extensions to a generic Prolog debugger allows us to provide users with a familiar, source-level view during the debugging of programs which use a variety of language extensions, such as functional notation, DCGs, or CLP{Q,R}.
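The annotation idea can be sketched outside Prolog: each translated fragment carries the source it came from, so a debugger can selectively reverse the translation and show the user the original, source-level form. A toy illustration (hypothetical expansion, not the paper's machinery):

```python
# Sketch of source-to-source translation with reversal annotations:
# the rewritten code keeps its origin attached, so tools downstream
# (e.g. a debugger) can display the source-level form instead of the
# expanded one. Toy expansion rule, not the paper's Prolog translation.

def translate(expr):
    """Expand a toy 'functional notation' X + Y into an explicit call,
    remembering the original text in an annotation."""
    if "+" in expr:
        lhs, rhs = expr.split("+", 1)
        return {"code": f"plus({lhs.strip()}, {rhs.strip()}, R)",
                "origin": expr}       # annotation that reverses the translation
    return {"code": expr, "origin": expr}

def debugger_view(translated):
    """A source-level debugger prints the origin, not the expanded code."""
    return translated["origin"]
```

Without the annotation the debugger could only show `plus(A, B, R)`; with it, the user sees the `A + B` they actually wrote.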

Relevance:

90.00%

Publisher:

Abstract:

ATM, SDH and satellite links were used in the last century as the contribution network of broadcasters. However, the attractive price of IP networks has been changing the infrastructure of these networks over the last decade. Nowadays IP networks are widely used, but their characteristics do not offer the level of performance required to carry high quality video under certain circumstances. Data transmission is always subject to errors on the line. In streaming, correction is attempted at the destination, while in file transfer, information is retransmitted until a reliable copy of the file is obtained; in the latter case, reception time is penalized because of the low priority this type of traffic usually has on the networks. While in streaming the image quality is adapted to the line speed and line errors result in a decrease of quality at the destination, in a file copy the difference between coding speed and line speed, together with transmission errors, is reflected in an increase of transmission time. The way news or audiovisual programmes are transferred from a remote office to the production centre depends on the time window and the type of line available; in many cases it must be done in real time (streaming), with the resulting image degradation. The main purpose of this work is workflow optimization and image quality maximization. To that end, a transmission model for multimedia files adapted to JPEG2000 is described, combining the advantages of file transfer with those of streaming while setting aside the disadvantages of each. The method is based on two patents and consists of the reliable transfer of the headers and of the data considered vital for reproduction, while the rest of the data is sent by streaming, enabling recovery operations and error concealment. Using this model, image quality is maximized for the available time window.
In this paper we first give a brief overview of broadcasters' requirements and the solutions offered by IP networks. We then focus on a different solution for video file transfer, taking the example of a broadcast centre with mobile units (unidirectional video link) and regional headends (bidirectional link), and we present a video file transfer method that satisfies the broadcasters' requirements.
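The split between vital and non-vital data described above can be sketched as follows (hypothetical fixed header size; the actual method, based on two patents, selects the JPEG2000 headers and data needed for reproduction):

```python
# Sketch of the hybrid transfer idea: the bytes vital for decoding travel
# over a reliable channel, everything else is streamed and may arrive with
# losses that only degrade quality. The fixed size below is an assumption
# for illustration, not the patented selection of JPEG2000 headers.

VITAL_BYTES = 1024  # assume the headers needed for reproduction fit here

def split_for_transfer(payload: bytes):
    """Return (reliable_part, streamed_part) of a media file."""
    return payload[:VITAL_BYTES], payload[VITAL_BYTES:]

def reassemble(reliable_part: bytes, streamed_part: bytes):
    """At the destination the vital headers are always intact; lost
    streamed bytes only lower quality, they never prevent decoding."""
    return reliable_part + streamed_part
```

The design point is that the reliable channel carries a small, fixed cost, while the streamed remainder adapts to the available time window, which is how the model maximizes image quality within it.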

Relevance:

90.00%

Publisher:

Abstract:

In this paper, a fuzzy-based Variable Structure Control (VSC) with guaranteed stability is presented. The main objective is to obtain improved performance for highly non-linear unstable systems. The main contributions of this work are, firstly, new functions for chattering reduction and error convergence that do not sacrifice the invariance property, whose loss is considered the main drawback of VSC; and secondly, a guarantee of the global stability of the controlled system. The well-known weighting-parameters approach is used to optimize the local and global approximation and modelling capability of the T-S fuzzy model. A one-link robot is chosen as a nonlinear unstable system to evaluate the robustness and effectiveness of the optimization approach and the high accuracy obtained in approximating nonlinear systems in comparison with the original T-S model. Simulation results indicate the potential and generality of the algorithm. The application of the proposed FLC-VSC shows that both alleviation of chattering and robust performance are achieved, and the effectiveness of the proposed controller is proven in the presence of disturbances and noise.
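A common way to reduce chattering in variable structure control, in the spirit of the new switching functions proposed above (though not their exact form), is to smooth the discontinuous sign term inside a boundary layer:

```python
import math

# Generic chattering-reduction sketch (not the paper's switching functions):
# the discontinuous -k*sign(s) term of classical sliding-mode control is
# replaced by a smooth saturation -k*tanh(s/phi) inside a boundary layer
# of width phi; phi -> 0 recovers the discontinuous (chattering) law.

def vsc_control(s, k=5.0, phi=0.05):
    """Switching control effort for sliding variable s."""
    return -k * math.tanh(s / phi)
```

Well outside the boundary layer the law behaves like the classical switch (full effort of magnitude k), but near s = 0 the control varies continuously instead of jumping between +k and -k at every sample, which is what suppresses chattering.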

Relevance:

90.00%

Publisher:

Abstract:

En la presente investigación se analiza la causa del hundimiento del cuarto compartimento del Tercer Depósito del Canal de Isabel II el 8 de abril de 1905, uno de los más graves de la historia de la construcción en España: fallecieron 30 personas y quedaron heridas otras 60. El Proyecto y Construcción de esta estructura era de D. José Eugenio Ribera, una de las grandes figuras de la ingeniería civil en nuestro país, cuya carrera pudo haber quedado truncada como consecuencia del siniestro. Dado el tiempo transcurrido desde la ocurrencia de este accidente, la investigación ha partido de la recopilación de la información relativa al Proyecto y a la propia construcción de la estructura, para revisar a continuación la información disponible sobre el hundimiento. De la construcción de la cubierta es interesante destacar la atrevida configuración estructural, cubriéndose una inmensa superficie de 74.000 m2 mediante una sucesión de bóvedas de hormigón armado de tan sólo 5 cm de espesor y un rebajamiento de 1/10 para salvar una luz de 6 m, que apoyaban en pórticos del mismo material, con pilares también muy esbeltos: 0,25 m de lado para 8 m de altura. Y todo ello en una época en la que la tecnología y conocimiento de las estructuras con este "nuevo" material se basaban en buena medida en el desarrollo de patentes. En cuanto a la información sobre el hundimiento, llama la atención en primer lugar la relevancia de los técnicos, peritos y letrados que intervinieron en el juicio y en el procedimiento administrativo posterior, poniéndose de manifiesto la trascendencia que el accidente tuvo en su momento y que, sin embargo, no ha trascendido hasta nuestros días. Ejemplo de ello es el papel de Echegaray -primera figura intelectual de la época- como perito en la defensa de Ribera, de D. 
Melquiades Álvarez -futuro presidente del Congreso- como abogado defensor, el General Marvá -uno de los máximos exponentes del papel de los ingenieros militares en la introducción del hormigón armado en nuestro país-, que presidiría la Comisión encargada del peritaje por parte del juzgado, o las opiniones de reconocidas personalidades internacionales del "nuevo" material como el Dr. von Emperger o Hennebique. Pero lo más relevante de dicha información es la falta de uniformidad sobre lo que pudo ocasionar el hundimiento: fallos en los materiales, durante la construcción, defectos en el diseño de la estructura, la realización de unas pruebas de carga cuando se concluyó ésta, etc. Pero la que durante el juicio y en los Informes posteriores se impuso como causa del fallo de la estructura fue su dilatación como consecuencia de las altas temperaturas que se produjeron aquella primavera. Y ello a pesar de que el hundimiento ocurrió a las 7 de la mañana... Con base en esta información se ha analizado el comportamiento estructural de la cubierta, permitiendo evaluar el papel que diversos factores pudieron tener en el inicio del hundimiento y en su extensión a toda la superficie construida, concluyéndose así cuáles fueron las causas del siniestro. De los resultados obtenidos se presta especial atención a las enseñanzas que se desprenden de la ocurrencia del hundimiento, enfatizándose en la relevancia de la historia -y en particular de los casos históricos de error- para la formación continua que debe existir en la Ingeniería. En el caso del hundimiento del Tercer Depósito algunas de estas "enseñanzas" son de plena actualidad, tales como la importancia de los detalles constructivos en la "robustez" de la estructuras, el diseño de estructuras "integrales" o la vigilancia del proceso constructivo. Por último, la investigación ha servido para recuperar, una vez más, la figura de D. José Eugenio Ribera, cuyo papel en la introducción del hormigón armado en España fue decisivo. 
En la obra del Tercer Depósito se arriesgó demasiado, y provocó un desastre que aceleró la transición hacia una nueva etapa en el hormigón estructural al abrigo de un mayor conocimiento científico y de las primeras normativas. También en esta etapa sería protagonista. This dissertation analyses the cause of the collapse of the 4th compartment of the 3th Reservoir of Canal de Isabel II in Madrid. It happened in 1905, on April 8th, being one of the most disastrous accidents occurred in the history of Spanish construction: 30 people died and 60 were injured. The design and construction supervision were carried out by D. José Eugenio Ribera, one of the main figures in Civil Engineering of our country, whose career could have been destroyed as a result of this accident. Since it occurred more than 100 years ago, the investigation started by compiling information about the structure`s design and construction, followed by reviewing the available information about the accident. With regard to the construction, it is interesting to point out its daring structural configuration. It covered a huge area of 74.000 m2 with a series of reinforced concrete vaults with a thickness of not more than 5 cm, a 6 m span and a rise of 1/10th. In turn, these vaults were supported by frames composed of very slender 0,25 m x 0,25 m columns with a height of 8 m. It is noteworthy that this took place in a time when the technology and knowledge about this "new" material was largely based on patents. In relation to the information about the collapse, its significance is shown by the important experts and lawyers that were involved in the trial and the subsequent administrative procedure. 
For example, Echegaray (the most important intellectual of the time) defended Ribera, Melquiades Álvarez (the future president of the Congress) was his lawyer, and General Marvá (who represented the important role of military engineers in the introduction of reinforced concrete in our country) led the Commission placed in charge of the root-cause analysis by the judge. In addition, the matter caught the interest of renowned foreigners such as Dr. von Emperger or Hennebique, whose opinions carried great weight. Nonetheless, this structural failure is unknown to most of today's engineers. What is most striking, however, is the range of causes that were claimed to lie at the root of the disaster: material defects, construction flaws, errors in the design, load tests performed after the structure was finished, etc. The cause finally put forth during the trial and in the subsequent reports was the dilatation of the roof due to the high temperatures that spring, even though the collapse occurred at 7 AM... Based on this information, the structural behaviour of the roof has been analysed, which allowed identifying the causes that could have provoked the initial failure and those that could have led to the global collapse. Lessons have been learned from these results, which point out the relevance of history (and in particular of examples gone wrong) for the continuous education that should exist in engineering. In the case of the 3rd Reservoir, some of these lessons are still relevant today, like the importance of detailing for "robustness", the design of "integral" structures, or the due consideration of construction methods. Finally, the investigation has revived, once again, the figure of D. José Eugenio Ribera, whose role in the introduction of reinforced concrete in Spain was crucial.
With the construction of the 3rd Reservoir he took too much risk and caused a disaster that accelerated the transition to a new era in structural concrete based on greater scientific knowledge and the first codes. In this new period he would also play a major role.
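The thermal-dilatation hypothesis debated at the trial lends itself to an order-of-magnitude check. The sketch below is a back-of-the-envelope estimate added for illustration, not part of the thesis: the material constants, temperature rise and roof run are assumed values, while the slab and column geometry come from the abstract. It computes the free expansion of a continuous run of vaults and the compressive stress that would arise if that expansion were fully restrained:

```python
# Illustrative thermal-expansion estimate. ALPHA, E, DELTA_T and roof_run
# are assumptions, not data from the thesis.

ALPHA = 1.0e-5   # 1/degC, typical thermal expansion coefficient of concrete (assumed)
E = 20e9         # Pa, elastic modulus of early reinforced concrete (assumed)
DELTA_T = 20.0   # degC, assumed temperature rise of the exposed roof

def free_expansion(length_m, alpha=ALPHA, dT=DELTA_T):
    """Unrestrained elongation (m) of a slab of the given length."""
    return alpha * length_m * dT

def restrained_stress(alpha=ALPHA, modulus=E, dT=DELTA_T):
    """Compressive stress (Pa) if the expansion is fully restrained."""
    return modulus * alpha * dT

roof_run = 40.0  # m, assumed continuous run of vaults without expansion joints
print(f"free expansion over {roof_run:.0f} m: {free_expansion(roof_run)*1000:.1f} mm")
print(f"fully restrained thermal stress: {restrained_stress()/1e6:.1f} MPa")
```

Even a modest temperature rise produces stresses of a few MPa when fully restrained, which illustrates why construction details and "integral" design matter for robustness; whether such stresses could actually have triggered a failure at 7 in the morning is precisely what the thesis questions.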

Relevância:

80.00% 80.00%

Publicador:

Resumo:

This Doctoral Thesis falls within the field of the measurement of pollutant emissions and fuel consumption of reciprocating internal combustion engines used as power plants for light-duty road vehicles, and more specifically within dynamic measurement with the vehicle driving in real traffic. In this field, the main objective of the Thesis is to study the problems associated with real-time, on-board measurement of environmental, energy and activity variables of light vehicles powered by combustion engines in real traffic, and, as a consequence, to develop an instrument and an appropriate methodology for this purpose, in order subsequently to study the different factors that influence the emissions and fuel consumption of passenger cars in real traffic. The Thesis begins with a prospective study of the work of other authors on the development of Portable Emission Measurement Systems (PEMS), the problems associated with dynamic emission measurement, and application studies in real traffic using this type of equipment. This study reveals the need for an instrument specifically designed to be carried on board a vehicle, capable of measuring emission concentrations and exhaust gas flow in real time while also recording engine, vehicle and environment variables such as road slope and meteorological data. On this basis, the specifications and design conditions of the PEMS instrument are established. Although some portable emission measurement systems (PEMS) were already on the market at the start of this Thesis, a new system of our own, called MIVECO – PEMS, is researched, designed and built here.
All the technical solutions incorporated in the system are presented, discussed and justified, including the gas analysis subsystem, the sampling subsystems (including the exhaust gas flowmeter), the subsystem for measuring environment variables and vehicle activity, and the set of auxiliary systems. The final design meets the stated hypotheses and needs, and is validated in real use, on a chassis dynamometer, and by comparison with other stationary and portable emission measurement systems. This Thesis also presents all the research that led to the methodology for processing the signals recorded in real time, including synchronisation, calculations and error propagation. The methodology for selecting and characterising the routes, circuits and driving patterns, preparing the vehicle, and calibrating the equipment is also part of the legacy of this Thesis. To demonstrate the measurement capability of the system and the type of results that can be obtained, which are useful for the scientific community and the environmental authorities, the final part of the Thesis presents the results of several studies of endogenous and exogenous variables that affect instantaneous emissions and the emission and consumption factors (g/km), such as driving style, road infrastructure, level of traffic congestion, urban versus extra-urban traffic, biofuel content, engine type (diesel or spark ignition), etc. The main conclusions of this Thesis are that it is possible to measure the mass emissions and fuel consumption of vehicle engines in real use, and that the results make it possible to establish policies for reducing environmental impact and improving energy efficiency; however, precise methodologies must be established and great care must be taken throughout the calibration, measurement and data post-processing process.
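The synchronisation step mentioned above matters because each analyser reports its reading with a transport delay through the sample line, so its signal must be realigned with the exhaust-flow signal before any mass calculation. A minimal sketch of one common alignment approach, cross-correlation over candidate lags, is shown below; this is an illustrative technique added by the editor, not necessarily the thesis's actual method, and the signals are simulated:

```python
import numpy as np

def align_delay(reference, delayed, max_lag):
    """Estimate the delay (in samples) of an analyser signal relative to a
    reference (e.g. exhaust flow) by maximising the correlation coefficient
    over candidate lags, and return the lag and the shifted signal."""
    best_lag, best_corr = 0, -np.inf
    for lag in range(max_lag + 1):
        c = np.corrcoef(reference[:len(reference) - lag], delayed[lag:])[0, 1]
        if c > best_corr:
            best_corr, best_lag = c, lag
    return best_lag, delayed[best_lag:]

# Simulated example: the analyser reports the same transient 3 samples late
flow = np.array([0, 0, 1, 4, 9, 4, 1, 0, 0, 0, 0, 0], dtype=float)
conc = np.roll(flow, 3)  # simulated 3-sample transport delay
lag, _ = align_delay(flow, conc, max_lag=5)
print(lag)  # expected: 3
```

In practice the delay also depends on exhaust flow rate and analyser response time, which is part of what makes dynamic measurement harder than steady-state testing.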
Abstract This doctoral thesis is in the field of the measurement of emissions and fuel consumption of reciprocating internal combustion engines when they are used as power plants for light-duty road vehicles, and especially in real-time dynamic measurement procedures with the vehicle being driven in real traffic. In this context, the main objective of this thesis is to study the problems associated with on-board, real-time measurement of environmental, energy and activity variables of light vehicles powered by internal combustion engines in real traffic and, as a result, to develop an instrument and an appropriate methodology for this purpose, and consequently to carry out a study of the different factors which influence the emissions and the fuel consumption of passenger cars in real traffic. The thesis begins with a prospective study of other authors' work on the development of Portable Emission Measurement Systems (PEMS), the problems associated with dynamic emission measurements, and application studies in actual traffic using PEMS. As a result of this study, it was shown that a measuring system specifically designed to be carried on board a vehicle is needed, one that can measure emission concentrations and exhaust flow in real time while recording engine, vehicle and environment variables such as road slope and atmospheric data; the specifications and design parameters of the equipment are then proposed. Although at the beginning of this research work some PEMS were already on the market, in this thesis a new system, called MIVECO – PEMS, is researched, designed and built in order to meet these measurement needs. All the technical solutions incorporated in the system are then presented, discussed and justified, including the gas analysis subsystem, the sampling and exhaust gas flowmeter subsystem, the subsystem for measuring environment variables and vehicle activity, and the set of auxiliary subsystems.
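The core calculation behind any PEMS combines the analyser concentration with the exhaust flowmeter reading. The sketch below is a simplified illustration, not the thesis's actual processing code: using the ideal-gas density of the measured species at assumed normal reference conditions, it converts a volume fraction and an exhaust volume flow into an instantaneous mass emission rate. The sample values (12 % CO2, 0.02 m³/s) are assumptions for the example:

```python
# Simplified instantaneous mass-rate calculation. Reference conditions and
# sample values are assumptions; real PEMS processing also corrects for
# water content, temperature, pressure and analyser response.

R = 8.314                    # J/(mol K), universal gas constant
P0, T0 = 101325.0, 273.15    # Pa, K: assumed normal reference conditions

def gas_density(molar_mass_kg, p=P0, t=T0):
    """Ideal-gas density of a pure species at reference conditions (kg/m3)."""
    return molar_mass_kg * p / (R * t)

def mass_rate_gps(vol_fraction, exhaust_flow_m3s, molar_mass_kg):
    """Instantaneous mass emission rate (g/s) from the measured concentration
    (volume fraction) and the exhaust volume flow (m3/s, referred to the
    same reference conditions)."""
    return vol_fraction * exhaust_flow_m3s * gas_density(molar_mass_kg) * 1000.0

# Example: 12 % CO2 in 0.02 m3/s of exhaust (assumed sample values)
co2_gps = mass_rate_gps(0.12, 0.02, 0.04401)  # 0.04401 kg/mol = molar mass of CO2
print(f"CO2 mass rate: {co2_gps:.2f} g/s")
```

Because the concentration and the flow are measured by different subsystems, the synchronisation and error-propagation methodology described in the thesis directly conditions the quality of this product.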
The final design meets the needs and hypotheses proposed, and is validated in real-life use and in chassis dynamometer testing, and is also compared with other stationary and on-board systems. This thesis also presents all the research that has led to the methodology for processing the set of signals recorded in real time, including signal timing, calculations and error propagation. The methodology for selecting and characterising the routes and circuits, the driving patterns, the vehicle preparation and the calibration of the instruments and sensors is also part of the legacy of this thesis. To demonstrate the measurement capabilities of the system and the type of results that can be obtained, which are useful for the scientific community and the environmental authorities, the end of this thesis presents the results of several studies of endogenous and exogenous variables that affect the instantaneous emissions and the averaged emission and consumption factors (g/km), such as driving style, road infrastructure, level of traffic congestion, urban and extra-urban traffic, biofuel content, type of engine (diesel or spark ignition), etc. The main conclusions of this thesis are that it is possible to measure the mass emissions and consumption of vehicle engines in actual use and that the results allow us to establish policies to reduce environmental impact and improve energy efficiency; however, it is necessary to establish precise methodologies and to be very careful in the entire process of calibration, measurement and data post-processing.
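The averaged factors reported in g/km follow from integrating the instantaneous signals over a trip and dividing by the distance travelled. A minimal sketch, added for illustration only (the sampling rate and the constant sample arrays are assumed; a real trip would use the recorded 1 Hz mass-rate and speed traces):

```python
def emission_factor_g_per_km(mass_rates_gps, speeds_ms, dt=1.0):
    """Trip emission factor (g/km): integrate the instantaneous mass rate
    (g/s) and the distance travelled, both sampled every dt seconds."""
    total_g = sum(m * dt for m in mass_rates_gps)
    total_km = sum(v * dt for v in speeds_ms) / 1000.0
    return total_g / total_km

# Assumed 1 Hz samples: a constant 2 g/s at a steady 20 m/s for one minute
rates = [2.0] * 60   # g/s
speeds = [20.0] * 60  # m/s
print(emission_factor_g_per_km(rates, speeds))  # 120 g over 1.2 km -> 100.0 g/km
```

Congestion illustrates why such factors are trip-dependent: at low average speed the denominator shrinks faster than the numerator, so the same engine can show a much higher g/km figure in urban traffic than on the open road.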