33 results for ALGOL (Computer program language)
Abstract:
The analysis of modes and natural frequencies is of primary interest in the computation of the response of bridges. In this article the transfer matrix method is applied to this problem to provide a computer code to calculate the natural frequencies and modes of bridge-like structures. The Fortran computer code is suitable for running on small computers and results are presented for a railway bridge.
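As a hedged illustration of the method named above (not the article's Fortran code: the span properties, the pinned-pinned boundary conditions and the frequency scan are assumptions for the example), a transfer-matrix calculation for a single uniform Euler-Bernoulli span can be sketched as follows. The state vector [w, w', M, V] is carried across the span by the matrix exponential of the section matrix, and the natural frequencies are the values of omega at which the boundary-condition determinant vanishes.

```python
# Minimal transfer-matrix sketch for the natural frequencies of a uniform,
# pinned-pinned span (illustrative values; not the paper's Fortran code).
import numpy as np
from scipy.linalg import expm

E, I, rho, A, L = 210e9, 8.0e-3, 7850.0, 0.12, 30.0   # assumed span properties (SI units)

def transfer_matrix(omega):
    # State z = [w, w', M, V]; Euler-Bernoulli free vibration: EI w'''' = omega^2 rho A w
    a = np.array([[0.0, 1.0, 0.0, 0.0],
                  [0.0, 0.0, 1.0 / (E * I), 0.0],
                  [0.0, 0.0, 0.0, 1.0],
                  [omega**2 * rho * A, 0.0, 0.0, 0.0]])
    return expm(a * L)

def bc_determinant(omega):
    # Pinned-pinned: w = M = 0 at both ends; the unknowns at x = 0 are w' and V.
    T = transfer_matrix(omega)
    return np.linalg.det(np.array([[T[0, 1], T[0, 3]],
                                   [T[2, 1], T[2, 3]]]))

# Scan omega and report sign changes of the determinant (bracketed roots).
omegas = np.linspace(1.0, 400.0, 4000)
d = np.array([bc_determinant(w) for w in omegas])
roots = omegas[:-1][np.sign(d[:-1]) != np.sign(d[1:])]
print("approximate natural frequencies (rad/s):", roots[:4])
# For a pinned-pinned beam the exact values are (n*pi/L)**2 * sqrt(E*I/(rho*A)).
```

For several spans or intermediate supports, the segment matrices would simply be multiplied in sequence before applying the end conditions, which is what makes the method attractive for bridge-like structures on small computers.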
Abstract:
This paper summarizes the work developed in order to establish a framework for the seismic retrofitting of bridges. In this context, the first objective is to find a numerical model to evaluate the damage induced in a structure under seismic action, as an index of its vulnerability. The model used has the advantage of being based on concepts of fracture mechanics and concentrated plasticity, so the work rests on first principles. The performance of this model is being evaluated, and some results of the computer program developed for this purpose are shown.
Abstract:
This paper describes a preprocessing module for improving the performance of a Spanish into Spanish Sign Language (Lengua de Signos Española: LSE) translation system when dealing with sparse training data. This preprocessing module replaces Spanish words with associated tags. The list of Spanish words (vocabulary) and associated tags used by this module is computed automatically, considering those signs that show the highest probability of being the translation of every Spanish word. This automatic tag extraction has been compared to a manual strategy, achieving almost the same improvement. In this analysis, several alternatives for dealing with non-relevant words have been studied. Non-relevant words are Spanish words not assigned to any sign. The preprocessing module has been incorporated into two well-known statistical translation architectures: a phrase-based system and a Statistical Finite State Transducer (SFST). This system has been developed for a specific application domain: the renewal of Identity Documents and Driver's Licenses. In order to evaluate the system, a parallel corpus made up of 4080 Spanish sentences and their LSE translations has been used. The evaluation results revealed a significant performance improvement when this preprocessing module is included. In the phrase-based system, the proposed module gave rise to an increase in BLEU (Bilingual Evaluation Understudy) from 73.8% to 81.0% and an increase in the human evaluation score from 0.64 to 0.83. In the case of the SFST, BLEU increased from 70.6% to 78.4% and the human evaluation score from 0.65 to 0.82.
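As a rough sketch of the tag-extraction step described above (not the authors' implementation: the toy counts, the 0.5 threshold and the __NONE__ label are invented for the example), each Spanish word can be assigned the sign with the highest estimated translation probability, and words whose best probability falls below the threshold can be treated as non-relevant:

```python
# Toy sketch of max-probability word-to-tag extraction from word/sign co-occurrence
# counts (illustrative data; the paper derives these from alignments on its corpus).

# counts[spanish_word][sign_tag] = times the word was aligned to that sign
counts = {
    "renovar":   {"RENEW": 41, "CHANGE": 3},
    "carnet":    {"LICENSE": 37},
    "de":        {"LICENSE": 5, "RENEW": 4, "ID": 6},   # function word, weak evidence
    "identidad": {"ID": 52},
}

THRESHOLD = 0.5            # assumed cut-off for deciding a word is relevant
NON_RELEVANT = "__NONE__"  # assumed label for words not assigned to any sign

def extract_tags(counts, threshold=THRESHOLD):
    vocabulary = {}
    for word, sign_counts in counts.items():
        total = sum(sign_counts.values())
        best_sign, best_count = max(sign_counts.items(), key=lambda kv: kv[1])
        prob = best_count / total
        vocabulary[word] = best_sign if prob >= threshold else NON_RELEVANT
    return vocabulary

def preprocess(sentence, vocabulary):
    # Replace each known word by its tag; one option for non-relevant words is to drop them.
    tags = [vocabulary.get(w, NON_RELEVANT) for w in sentence.lower().split()]
    return [t for t in tags if t != NON_RELEVANT]

vocab = extract_tags(counts)
print(preprocess("Renovar carnet de identidad", vocab))   # ['RENEW', 'LICENSE', 'ID']
```

Dropping non-relevant words is only one of the alternatives studied in the paper; keeping them under a dedicated tag would be the other obvious option.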
Abstract:
The great developments that have occurred during the last few years in the finite element method and its applications have kept other options for computation hidden. The boundary integral equation method now appears as a valid alternative and, in certain cases, has significant advantages. This method deals only with the boundary of the domain, while the F.E.M. analyses the whole domain. This has the following advantages: the dimensions of the problem to be studied are reduced by one, consequently simplifying the system of equations and the preparation of input data, and it is also possible to analyse infinite domains without discretization errors. These simplifications have the drawbacks of having to solve a full, non-symmetric matrix, and some difficulties arise in the imposition of boundary conditions when complicated variations of the function over the boundary are assumed. In this paper a practical treatment of these problems, in particular the imposition of boundary conditions, has been carried out using the computer program presented below. Program SERBA solves general elastostatics problems in 2-dimensional continua using the boundary integral equation method. The boundary of the domain is discretized into line elements over which the functions are assumed to vary linearly. Data (stresses and/or displacements) are introduced in the local co-ordinate system (element co-ordinates). Resulting stresses are obtained in local co-ordinates and displacements in a general system. The program has been written in Fortran ASCII and implemented on a Univac 1108 computer. For 100 elements the core requirements are about 40 Kwords. Also available is a Fortran IV version (3 segments) implemented on a Hewlett-Packard 21MX computer, using 15 Kwords.
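Since the program's data and stress results are handled in element (local) coordinates while displacements are referred to the general system, the change of basis involved is just a rotation built from the element geometry; a minimal sketch (the end nodes and traction values are assumptions for illustration, not part of SERBA) could look like this:

```python
# Sketch of the local (tangent/normal) to global rotation for a straight 2-D
# boundary element defined by its end nodes (illustrative values only).
import numpy as np

def element_rotation(p1, p2):
    """Return the 2x2 matrix whose columns are the element tangent and normal."""
    t = np.asarray(p2, float) - np.asarray(p1, float)
    t /= np.linalg.norm(t)           # unit tangent along the element
    n = np.array([t[1], -t[0]])      # normal (the sign is a convention choice)
    return np.column_stack((t, n))

p1, p2 = (0.0, 0.0), (1.0, 1.0)      # element end nodes (assumed)
R = element_rotation(p1, p2)

traction_local = np.array([0.0, 10.0])    # e.g. a purely normal traction in element axes
traction_global = R @ traction_local      # the same vector in the general system
traction_back = R.T @ traction_global     # R is orthogonal, so its transpose inverts it
print(traction_global, traction_back)
```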
Abstract:
ImageJ is a digital image processing program aimed mainly at the health sciences field. It is public domain, open source software developed in the Java language at the National Institutes of Health of the United States. It includes powerful built-in tools to edit, process and analyze images of almost any type and format. However, its greatest virtue is its extensibility: the functionality of ImageJ can be extended to solve almost any digital image processing problem by means of macros, scripts and, especially, plugins written in Java through the API it offers. In addition, ImageJ has official repositories from which macros, scripts and plugins applicable in a multitude of environments can be obtained free of charge, thanks to the work of the large community of ImageJ developers, which frequently debugs, improves and extends them. This document is the report of a project consisting of a detailed analysis of the digital image processing tools offered by ImageJ. Its goal is to determine whether ImageJ, despite being focused on the health sciences, can be useful in the environment of the Escuela Técnica Superior de Ingeniería y Sistemas de Telecomunicación of the Universidad Politécnica de Madrid and, if so, to highlight the characteristics that could be most beneficial in this field and to serve as an introductory guide.
In the following pages the ImageJ tools (version 1.48q) are examined one by one, together with their operation and the underlying mechanisms, following the order of the user-interface menus: the first chapter covers the tools for manipulating images in general (Image menu); the second, the processing tools (Process menu); the third, the analysis tools (Analyze menu); and the fourth and last, the tools related to the extensibility of ImageJ (Plugins menu).
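As a small, hedged example of the extensibility discussed above (the file path and filter parameters are placeholders), a Jython script run from ImageJ's script editor can drive the built-in Process-menu commands programmatically:

```python
# Jython script for ImageJ's script editor: open an image, apply a built-in
# Process-menu filter and show the result (the path and sigma are placeholders).
from ij import IJ

imp = IJ.openImage("/path/to/image.tif")      # any format ImageJ can read
IJ.run(imp, "Gaussian Blur...", "sigma=2")    # same command string the macro recorder produces
IJ.run(imp, "8-bit", "")                      # convert, as the Image > Type menu would
imp.show()
```

The same calls can be packaged as a plugin written in Java against the ImageJ API when more than scripting is needed.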
Abstract:
Computer program developed on the EXCEL (VBA) platform for the design of two-phase and three-phase separators, both vertical and horizontal. The application determines the physical properties of the fluid using different correlations based on the Black Oil Model; with those properties it predicts the flow regime present. If the flow regime is slug flow, the program determines the dimensions of the slug catcher required. For the given operating conditions the program designs the selected separator: two-phase or three-phase, vertical or horizontal. Finally, the application estimates the cost of the equipment.
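As a hedged illustration of the kind of sizing calculation such an application performs (a simplified stand-in, not the VBA code: the fluid properties and the K-factor below are typical assumed values), the Souders-Brown equation gives the maximum allowable gas velocity and hence a minimum internal diameter for a vertical two-phase separator:

```python
# Souders-Brown sizing sketch for a vertical two-phase separator
# (illustrative property values; the actual program derives them from
# Black Oil Model correlations).
import math

rho_liq = 780.0      # liquid density, kg/m3 (assumed)
rho_gas = 35.0       # gas density at operating conditions, kg/m3 (assumed)
q_gas   = 0.85       # actual gas volumetric flow rate, m3/s (assumed)
K       = 0.107      # Souders-Brown factor, m/s (typical value for a vertical vessel)

# Maximum allowable gas velocity and the corresponding minimum internal diameter
v_max = K * math.sqrt((rho_liq - rho_gas) / rho_gas)
d_min = math.sqrt(4.0 * q_gas / (math.pi * v_max))

print(f"v_max = {v_max:.3f} m/s, minimum diameter = {d_min:.2f} m")
```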
Abstract:
This paper presents a Levy-type solution for the natural frequencies of translational shells. A computer program in FORTRAN IV language corresponding to this solution is described. This direct solution is compared with some indirect solutions utilising Galerkin and Rayleigh methods. An extension to the study of forced vibrations is outlined.
Abstract:
Light detection and ranging (LiDAR) technology is beginning to have an impact on agriculture. Canopy volume and/or fruit tree leaf area can be estimated using terrestrial laser sensors based on this technology. However, these devices may be used with different options depending on the resolution and scanning mode. As a consequence, data accuracy and LiDAR-derived parameters are affected by the sensor configuration, and may vary according to the vegetative characteristics of the tree crops. Given this scenario, users and suppliers of these devices need to know how to use the sensor in each case. This paper presents a computer program to determine the best configuration, allowing the simulation and evaluation of different LiDAR configurations in various tree structures (or training systems). The ultimate goal is to optimise the use of laser scanners in field operations. The software presented generates a virtual orchard and then allows the scanning to be simulated with a laser sensor. Trees are created using a hidden Markov tree (HMT) model. Varying the foliar structure of the orchard, the LiDAR simulation was applied to twenty different artificially created orchards, with and without leaves, from two positions (lateral and zenith). To validate the laser sensor configuration, the leaf surface of the simulated trees was compared with the parameters obtained from the LiDAR measurements: the impacted leaf area, the impacted total area (leaves and wood), and the impacted area in the three outer layers of leaves.
Abstract:
This thesis studies the built work of the architect Emilio Pérez Piñero, all of it within the field of demountable and deployable spatial bar structures; it produces the documentation that makes his research transmissible and generalises the study of the behaviour of the deployable family. The work of this architect forms an original and attractive body without direct successors and, on the other hand, research on this type of structure is scarce (built examples even more so), since their definition, their mobility and their structural behaviour must all be resolved. The part devoted to demountable structures is limited to single-layer reticulated domes using the grid layout and assembly system devised by Piñero, on the grounds that his contribution should be documented without going deeper into a field of research that already has abundant studies. The mathematical solution and a computer program for the complete geometric definition of the grid employed are provided. The deployable structures are characterised by the use of bars arranged in an "X" within the thickness of the structure, generating both flat and curved surfaces. In both cases the mobility in the mechanism phase is analysed, both for Piñero's solutions and for the complementary ones presented. The geometric relations that must be satisfied for the movement of the bars to be possible are studied; these relations, which are particularly complex for deployables following spherical surfaces, determine their geometric definition.
In regard to the structural phase, in addition to analysing Piñero's built work, documenting and defining its components, several possible structures are proposed for each mechanism, and the double-layer grid of constant thickness is developed in detail, including a comparative study of nine different variants. The wide range of possible applications of these structures is shown.
Abstract:
Corrosion of reinforcing steel in concrete due to chloride ingress is one of the main causes of the deterioration of reinforced concrete structures. The structures most affected by such corrosion are buildings in marine zones and structures exposed to de-icing salts, such as highways and bridges. The process is accompanied by an increase in volume of the corrosion products at the rebar-concrete interface. Depending on the level of oxidation, iron can expand to as much as six times its original volume. This increase in volume exerts tensile stresses in the surrounding concrete which result in cracking and spalling of the concrete cover if the concrete tensile strength is exceeded. The mechanism by which steel embedded in concrete corrodes in the presence of chloride is the local breakdown of the passive layer formed in the highly alkaline environment of the concrete. It is assumed that corrosion initiates when a critical chloride content reaches the rebar surface. The mathematical formulation idealizes the corrosion sequence as a two-stage process: an initiation stage, during which chloride ions penetrate to the reinforcing steel surface and depassivate it, and a propagation stage, in which active corrosion takes place until cracking of the concrete cover occurs. The aim of this research is to develop computer tools to evaluate the duration of the service life of reinforced concrete structures, considering both the initiation and propagation periods. Such tools must offer a friendly interface to facilitate their use by researchers even when their background is not in numerical simulation. For the evaluation of the initiation period different tools have been developed. Program TavProbabilidade provides the means to carry out a probability analysis of a chloride ingress model. Such a tool is necessary due to the lack of data and the general uncertainties associated with the phenomenon of chloride diffusion. It differs from the deterministic approach because it computes not just a chloride profile at a certain age, but a range of chloride profiles, each with its probability of occurrence. Program TavProbabilidade_Fiabilidade carries out reliability analyses of the initiation period. It takes into account the critical value of the chloride concentration at the steel that causes breakdown of the passive layer and the beginning of the propagation stage. It differs from the deterministic analysis in that it does not predict whether corrosion is going to begin or not, but quantifies the probability of corrosion initiation. Program TavDif_1D performs a one-dimensional deterministic analysis of the chloride diffusion process by the finite element method (FEM), which numerically solves Fick's second law. Despite the various FEM solvers already available in one dimension, the decision to create a new code (TavDif_1D) was taken because of the need for a solver with a friendly pre- and post-processing interface suited to the needs of the IETCC. An innovative tool was also developed with a systematic method devised to compare the ability of different 1D models to predict the actual evolution of chloride ingress based on experimental measurements, and also to quantify the degree of agreement of the models with each other. For the evaluation of the entire service life of the structure, a computer program has been developed using the finite element method to couple both service-life periods: initiation and propagation.
The program for 2D (TavDif_2D) allows the complementary use of two external programs through a single friendly interface: GMSH, a finite element mesh generator and post-processing viewer, and OOFEM, a finite element solver. TavDif_2D is responsible for deciding, at each time step, when and where to start applying the boundary conditions of the fracture mechanics module as a function of the chloride concentration and the corrosion parameters (Icorr, etc.). It is also responsible for verifying the presence and degree of fracture in each element, in order to pass on the variation of the diffusion coefficient with the crack width. The advantages of the FEM with the interface provided by the tool are: the flexibility to input data such as material properties and boundary conditions as time-dependent functions; the flexibility to predict the chloride concentration profile for different geometries; and the possibility of coupling chloride diffusion (initiation stage) with chemical and mechanical behavior (propagation stage). The OOFEM code had to be modified to accept temperature, humidity and time-dependent values for the material properties, which is necessary to adequately describe the environmental variations. A 3D simulation has been performed to simulate the behavior of a beam under both the external load and the internal load caused by the corrosion products, using embedded-fracture elements, in order to plot the curve of the deflection of the central region of the beam versus the external load and compare it with the experimental data.
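For the initiation period, the deterministic backbone of tools like these is the error-function solution of Fick's second law; the sketch below (a simplified stand-in, not the TavProbabilidade or TavDif_1D code, with all parameter values assumed) evaluates the chloride content at the rebar and a crude Monte Carlo estimate of the probability that the critical content is reached within a target service life:

```python
# Sketch of the initiation-period model: Fick's second law with constant surface
# chloride, C(x, t) = Cs * erfc(x / (2*sqrt(D*t))), plus a crude Monte Carlo estimate
# of the probability of corrosion initiation (all parameter values are assumed).
import numpy as np
from scipy.special import erfc

Cs, Ccrit = 0.60, 0.05        # surface and critical chloride contents, % by weight of concrete
D = 1.0e-13                   # apparent diffusion coefficient, m2/s
cover = 0.045                 # concrete cover, m
t = 50 * 365.25 * 24 * 3600   # target service life: 50 years, in seconds

def chloride(x, t, Cs=Cs, D=D):
    return Cs * erfc(x / (2.0 * np.sqrt(D * t)))

print("chloride at the rebar after 50 years: %.3f %%" % chloride(cover, t))

# Monte Carlo: sample cover depth and diffusion coefficient, count initiations.
rng = np.random.default_rng(0)
n = 100_000
cover_s = rng.normal(cover, 0.008, n)        # construction scatter (assumed)
D_s = rng.lognormal(np.log(D), 0.4, n)       # material scatter (assumed)
p_init = np.mean(chloride(cover_s, t, D=D_s) >= Ccrit)
print("estimated probability of initiation within 50 years: %.2f" % p_init)
```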
Abstract:
This paper presents the implementation of an adaptive philosophy to plane potential problems, using the direct boundary element method. After some considerations about the state of the art and a discussion of the standard approach features, the possibility of separately treating the modelling of variables and their interpolation through hierarchical shape functions is analysed. Then the proposed indicators and estimators are given, followed by a description of a small computer program written for an IBM PC. Finally, some examples show the kind of results to be expected.
Abstract:
This paper presents a computer program developed to run on an IBM PC microcomputer which incorporates some features in order to optimize the number of operations needed to compute the solution of plane potential problems governed by Laplace's equation using the Boundary Integral Equation Method (B.I.E.M.). Also incorporated is a routine to plot isolines inside the domain under study.
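The isoline routine mentioned above can be mimicked in a few lines once the potential has been evaluated at interior points; in the sketch below a simple harmonic function stands in for the values that the B.I.E.M. solver would actually provide:

```python
# Sketch of plotting isolines of a potential over a rectangular domain.
# The harmonic function u(x, y) = x*y stands in for the values that the
# boundary-integral solver would return at the interior points.
import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(0.0, 1.0, 60)
y = np.linspace(0.0, 1.0, 60)
X, Y = np.meshgrid(x, y)
U = X * Y                      # placeholder potential satisfying Laplace's equation

cs = plt.contour(X, Y, U, levels=12, colors="k", linewidths=0.8)
plt.clabel(cs, inline=True, fontsize=7)
plt.gca().set_aspect("equal")
plt.title("Isolines of the potential")
plt.show()
```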
Abstract:
This paper presents some of the modelling criteria that have been used for the study of pyrotechnic shock propagation in the A5 VEB structure, as well as the main conclusions from a mathematical model of the axisymmetric effects in it. The separation of the lower stage of the ARIANE 5 Vehicle Equipment Bay (VEB) structure is to be carried out using a pyrotechnic device. The wave propagation effects produced by the explosion have been analyzed with a computer program that uses as shape functions the analytical solution to the frequency response of Timoshenko-Rayleigh beams and shells; in that way the discretization can use elements as large as possible, depending on the material properties and boundary conditions. Moreover, a wide range of possibilities is available for the treatment of concentrated masses, springs and dashpots, either with respect to a fixed reference or between nodes, for translational as well as rotational degrees of freedom.
Abstract:
This work studies the contribution that methods for aggregating expert judgments can make to the calculation of the seismic hazard of sites. Calculations have been carried out for two sites in the Iberian Peninsula, Mugardos (La Coruña) and Cofrentes (Valencia), which are subject to different tectonic regimes and which, in addition, host industrial installations of great responsibility. The study zones, of 320 km radius, do not overlap. A probabilistic approach has been applied to the estimation of the annual rate of exceedance of the peak horizontal acceleration, and the Monte Carlo method has been used to transfer to the results the uncertainty present in the data defining each seismogenic source and its seismicity. The calculations were performed with a computer program, developed for this work, that follows the methodology proposed by the Senior Seismic Hazard Analysis Committee (1997) for the NRC. The first conclusion drawn from the results is that attenuation is the main source of uncertainty in the hazard estimates at both sites. Given the difficulty of completing the available historical data for this variable, the behaviour of four mathematical methods for aggregating expert judgments when estimating an attenuation law at a site has been studied. The input data were obtained from the isoseismal catalogue of the IGN, and the earthquakes used as seed variables were chosen so as to cover evenly the available historical record and the observed magnitude values.
A separate panel of experts was assigned to each of the two sites, and the methods of Cooke, equal weights, Apostolakis-Mosleh and Morris were applied to their judgments. Their proposals were compared with the actual data in order to judge their efficacy and ease of application. The results show that Cooke's method exhibited the most efficient and robust behaviour for both sites; this method also allowed a reasoned identification of those experts who should not have been included in a panel.
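As a schematic illustration of the probabilistic calculation described (a toy stand-in, not the program developed for this work: the source geometry, the Gutenberg-Richter parameters and the attenuation coefficients are invented for the example), the annual rate of exceedance of a peak-acceleration level can be estimated by Monte Carlo sampling of magnitude, distance and attenuation scatter:

```python
# Toy Monte Carlo estimate of the annual rate of exceedance of a PGA level
# for a single areal source (all numerical values are illustrative only).
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

nu = 0.5                       # annual rate of earthquakes with M >= Mmin in the source
Mmin, Mmax, b = 4.0, 7.0, 1.0  # Gutenberg-Richter parameters (assumed)

# Sample magnitudes from a truncated exponential (Gutenberg-Richter) distribution.
beta = b * np.log(10.0)
u = rng.random(n)
M = Mmin - np.log(1.0 - u * (1.0 - np.exp(-beta * (Mmax - Mmin)))) / beta

# Sample epicentral distances uniformly over a circular source of radius 100 km
# centred on the site, and add a 10 km focal depth.
R = np.sqrt(rng.random(n)) * 100.0
R = np.sqrt(R**2 + 10.0**2)

# Generic attenuation form ln(PGA[g]) = a0 + a1*M - a2*ln(R) + eps (coefficients assumed).
eps = rng.normal(0.0, 0.6, n)
pga = np.exp(-3.5 + 0.9 * M - 1.2 * np.log(R) + eps)

for level in (0.05, 0.10, 0.20):
    rate = nu * np.mean(pga >= level)
    print(f"annual rate of exceedance of {level:.2f} g: {rate:.2e}")
```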
Abstract:
In this paper we show the possibility of applying adaptive procedures as an alternative to the well-known philosophy of standard Boundary Elements. The three characteristic steps of adaptive procedures, i.e. hierarchical shape function families, indicator criteria, and a posteriori estimation, can be defined in order to govern automatic refinement and stopping of the solution process. A computer program to treat potential problems, called QUEIMADA, has been developed to show the capabilities of the new idea.
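The three steps listed above fit a generic driver loop like the one sketched below; the boundary-element solve is replaced here by piecewise linear interpolation of a known function, so only the indicator, estimator and stopping logic of an adaptive procedure is illustrated (this is not the QUEIMADA program, and all values are assumed):

```python
# Toy adaptive refinement loop illustrating the indicator / estimator / stopping
# structure described above. The boundary-element solve is replaced by piecewise
# linear interpolation of a known function, so only the adaptive logic is shown.
import numpy as np

f = lambda x: np.arctan(50.0 * (x - 0.3))     # "solution" with a sharp gradient

nodes = np.linspace(0.0, 1.0, 5)              # initial coarse mesh
tol, max_steps = 1e-3, 30

for step in range(max_steps):
    # Local indicator per element: interpolation error sampled at the midpoint.
    mids = 0.5 * (nodes[:-1] + nodes[1:])
    interp = 0.5 * (f(nodes[:-1]) + f(nodes[1:]))
    indicator = np.abs(f(mids) - interp)

    # A posteriori estimate of the global error and stopping test.
    estimate = np.max(indicator)
    if estimate < tol:
        break

    # Refine the elements whose indicator exceeds half of the largest one.
    marked = indicator > 0.5 * estimate
    nodes = np.sort(np.concatenate([nodes, mids[marked]]))

print(f"stopped after {step} steps with {nodes.size} nodes, estimate {estimate:.1e}")
```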