20 results for Digital Manufacturing, Digital Mock Up, Simulation Intent
at Universidad Politécnica de Madrid
Abstract:
The design of a Final Assembly Line (FAL) is carried out during the product industrialization activity. The phase dealing with the definition of conceptual solutions depends heavily on personnel experience and is time-consuming. To improve this process, the development of a knowledge-based software application is proposed to assist designers in defining scenarios and to generate conceptual FAL alternatives. Both the scenario and the generated FAL solution are part of the industrialization digital mock-up (IDMU). A commercial software application used in aircraft programmes, and supporting the IDMU concepts of Product, Process and Resource, was selected to implement a software prototype. This communication presents the adopted methodological approach and the architecture of the developed application.
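Not from the paper, but as a rough illustration of what the Product, Process and Resource structure behind such an IDMU could look like, here is a minimal Python sketch; all class and attribute names are hypothetical.

# Minimal, hypothetical sketch of the Product/Process/Resource (PPR) structure
# that an industrialization digital mock-up ties together; the names are
# illustrative only, not taken from the paper or from any PLM tool.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Product:
    name: str                      # e.g. an aerostructure component


@dataclass
class Resource:
    name: str                      # e.g. a jig, tool or workstation


@dataclass
class Process:
    name: str                      # an assembly operation
    consumes: List[Product] = field(default_factory=list)
    uses: List[Resource] = field(default_factory=list)


@dataclass
class FALAlternative:
    """One conceptual Final Assembly Line solution inside the IDMU."""
    scenario: str
    stations: List[Process] = field(default_factory=list)


# Example: a two-station conceptual FAL generated for a given scenario.
fal = FALAlternative(
    scenario="medium-size aerostructure",
    stations=[
        Process("join skin panels", consumes=[Product("skin panel")],
                uses=[Resource("assembly jig")]),
        Process("install systems", consumes=[Product("harness")],
                uses=[Resource("workstation 2")]),
    ],
)
print(len(fal.stations), "stations in this alternative")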
Abstract:
AIRBUS Military has undertaken a project to implement the industrial Digital Mock-Up (iDMU) concept to support the industrialization process of a medium-size aerostructure. Within the framework of a collaborative engineering strategy, the project is part of the efforts to deploy Digital Manufacturing as a key technology for the industrialization of aircraft assembly lines. The project has confirmed the potential of the iDMU to improve the industrial design process in a collaborative engineering environment. This communication presents the main project objectives, the key methodological points, the main project achievements and the additional developments planned to increase the scope and benefits of the iDMU concept.
Abstract:
Airbus has designed and industrialized aircraft using Concurrent Engineering techniques for decades. The introduction of new PLM methods, procedures and tools, and the need to reduce time-to-market, led Airbus Military to pursue new working methods. Traditional Engineering works sequentially; Concurrent Engineering basically overlaps tasks between teams; Collaborative Engineering promotes teamwork to develop product, processes and resources from the conceptual phase to the start of serial production. The CALIPSO-neo pilot project was launched to support the industrialization process of a medium-size aerostructure. The aim is to implement the industrial Digital Mock-Up (iDMU) concept and to exploit it to create shop floor documentation. Within the framework of a collaborative engineering strategy, the project is part of the efforts to deploy Digital Manufacturing as a key technology for the industrialization of aircraft assembly lines. This paper presents the context, the conceptual approach and the methodology adopted.
Abstract:
The prediction of tritium production is required for sample handling procedures, safety and maintenance, and licensing of the International Fusion Materials Irradiation Facility (IFMIF).
Abstract:
A validation of the burn-up simulation system EVOLCODE 2.0 is presented here, involving the experimental measurement of U and Pu isotopes and some fission-fragment production ratios after a burn-up of around 30 GWd/tU in a Pressurized Light Water Reactor (PWR). This work provides an in-depth analysis of the validation results, including the possible sources of the uncertainties. An uncertainty analysis based on the sensitivity methodology has also been performed, providing the uncertainties in the isotopic content propagated from the cross-section uncertainties. An improvement of the classical Sensitivity/Uncertainty (S/U) model has been developed to take into account the implicit dependence on the neutron flux normalization, that is, the effect of the constant power of the reactor. The improved S/U methodology, usually neglected in this kind of study, has proven to be an important contribution to the explanation of some simulation-experiment discrepancies. In general, the cross-section uncertainties are, for the most relevant actinides, an important contributor to the simulation uncertainties, of the same order of magnitude as, and sometimes even larger than, the experimental uncertainties and the experiment-simulation differences. Additionally, some hints for the improvement of the JEFF3.1.1 fission yield library and for the correction of some errata in the experimental data are presented.
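For reference, the classical first-order ("sandwich") S/U propagation that the abstract builds on can be written, in generic notation rather than the paper's own, as
\[
  S_{ij} = \frac{\sigma_j}{N_i}\,\frac{\partial N_i}{\partial \sigma_j},
  \qquad
  \left(\frac{\Delta N_i}{N_i}\right)^{2} \simeq \sum_{j,k} S_{ij}\, C_{jk}\, S_{ik},
\]
where \(N_i\) is a nuclide concentration after burn-up, \(\sigma_j\) are the cross sections and \(C\) their relative covariance matrix. As we read the abstract, the improvement consists in using total rather than partial derivatives in \(S_{ij}\), so that the implicit contribution \((\partial N_i/\partial \phi)(\mathrm{d}\phi/\mathrm{d}\sigma_j)\), introduced by renormalizing the flux \(\phi\) to keep the reactor power constant, is retained.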
Abstract:
T activity in LiPb mock-up material irradiated at Frascati: measurement and MCNP results
Abstract:
The application of liquid metal technology in fusion devices requires R&D related to many phenomena: the interaction between liquid metals and structural materials, such as corrosion, erosion and passivation techniques; magneto-hydrodynamics; free-surface fluid dynamics; and any other physical aspect needed for their safe and reliable operation. In particular, there is a significant shortage of experimental facilities dedicated to the development of lithium technology. In the framework of the TECHNOFUSION project, an experimental laboratory devoted to lithium technology development is proposed, in order to shed some light on the path to IFMIF and the design of the chamber first wall and divertors. The conceptual design foresees a development in two stages, the first consisting of a material testing loop. The second stage proposes the construction of a mock-up of the IFMIF target that will allow the behaviour of a free-surface lithium target under vacuum conditions to be assessed. In this paper, such a conceptual design is addressed.
Abstract:
The French CEA, together with EDF and the IAEA, recently organised an international benchmark to evaluate the ability to model the mechanical behaviour of a typical nuclear reinforced concrete structure subjected to seismic demands. The participants were provided with descriptions of the structure and the testing campaign; they had to propose the numerical model and the material laws for the concrete (stage #1). A mesh of beam and shell elements was generated; for modelling the concrete a damaged-plasticity model was used, but a smeared-crack model was also investigated. Some of the initial experimental results, with the mock-up remaining in the elastic range, were provided to the participants for calibrating their models (stage #2). Predictions had to be produced in terms of eigenfrequencies and motion time histories. The calculated frequencies reproduced the experimental ones reasonably well; the time histories, calculated by modal response analysis, also reproduced the observed amplifications adequately. The participants were then expected to predict the structural response under strong ground motions (stage #3), which increased progressively up to a history recorded during the 1994 Northridge earthquake, followed by an aftershock. These results were produced using an explicit solver and a damaged-plasticity model for the concrete, although an implicit solver with a smeared-crack model was also investigated. The paper presents the conclusions of the pre-test exercise, as well as some observations from additional simulations conducted after the experimental results were made available.
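For readers unfamiliar with the stage #2 calculations, the eigenfrequencies and the modal response analysis mentioned above correspond, in standard (not benchmark-specific) notation, to
\[
  \left(\mathbf{K} - \omega_n^{2}\,\mathbf{M}\right)\boldsymbol{\phi}_n = \mathbf{0},
  \qquad
  \ddot{q}_n + 2\zeta_n\omega_n\,\dot{q}_n + \omega_n^{2}\,q_n = -\Gamma_n\,\ddot{u}_g(t),
  \qquad
  \mathbf{u}(t) \simeq \sum_n \boldsymbol{\phi}_n\,q_n(t),
\]
with \(\mathbf{M}\) and \(\mathbf{K}\) the mass and stiffness matrices of the beam/shell mesh, \(\boldsymbol{\phi}_n\) and \(\omega_n\) the mode shapes and circular eigenfrequencies, \(\zeta_n\) the modal damping ratios, \(\Gamma_n\) the participation factors and \(\ddot{u}_g\) the imposed ground acceleration.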
Abstract:
This paper analyzes the different adjustment methods commonly used to characterize circular features in indirect metrology: the least-squares circle, the minimum zone circle, the maximum inscribed circle and the minimum circumscribed circle. The analysis was performed on images obtained with digital optical machines. The self-developed calculation algorithms were implemented in Matlab® and take as study variables the amplitude of the angular sector of the circular feature, its nominal radius and the magnification used by the optical machine. Under different conditions, the radius and circularity error of different circular standards were determined. Comparing the results obtained by the different adjustment methods with the certified values of the standards has allowed us to determine the accuracy of each method and its scope of application.
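As a concrete illustration of the first of the four criteria, the short Python sketch below (not the authors' Matlab® code) fits a least-squares circle to points sampled on a limited angular sector and evaluates a simple circularity error; all values and function names are illustrative.

# Hedged sketch: algebraic (Kasa-type) least-squares circle fit, one of the four
# adjustment criteria named in the abstract.
import numpy as np

def fit_circle_lsq(x, y):
    """Least-squares circle through points (x, y): returns centre (a, b) and radius r."""
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    rhs = x**2 + y**2
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    r = np.sqrt(c + a**2 + b**2)
    return a, b, r

# Synthetic arc covering a limited angular sector, as in the study variables.
theta = np.linspace(0, np.pi / 3, 200)                 # 60 degree sector
x = 5.0 * np.cos(theta) + np.random.normal(0, 0.01, theta.size)
y = 5.0 * np.sin(theta) + np.random.normal(0, 0.01, theta.size)
a, b, r = fit_circle_lsq(x, y)
circularity = np.ptp(np.hypot(x - a, y - b))           # peak-to-valley radial deviation
print(f"centre=({a:.3f}, {b:.3f})  radius={r:.3f}  circularity error={circularity:.4f}")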
Abstract:
This project studies and analyzes digital signal processing techniques applied to accelerometers, using a DSP-based prototyping board to carry out the tests. The project focuses on digital filtering of signals coming from a specific accelerometer, the 1201F, whose main field of application is the automotive industry. After studying the processing theory and the characteristics of the filters, an application was designed with the target environment in mind. The design phases are described: computer design (Matlab), implementation of the filters on the DSP (C), tests on the DSP without the accelerometer, calibration of the accelerometer, and final tests with the accelerometer. The tools used are the Analog Devices 21-161N evaluation kit (equipped with the VisualDSP++ 4.5 development environment), the 1201F accelerometer, the Spektra CS-18-LF accelerometer calibration system, and the MATLAB 7.5 and CoolEdit Pro 2.0 software packages. Only 2nd-order IIR filters were implemented, of all types (Butterworth, Chebyshev I and II, and elliptic): narrow-band band-pass and band-stop filters within the full scale allowed by the accelerometer. Once all the tests, both simulated and physical, were completed, the filters with the best performance were selected and analyzed to draw conclusions. Since a suitable environment was available, the filters were also combined in several ways to obtain higher-order filters (parallel structure); starting from band-pass filters, other configurations can thus be obtained that provide greater flexibility. The aim of this project is not only to obtain good filtering results, but also to take advantage of the environment and the available tools to produce the most efficient design possible.
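As an illustration of the kind of filters described, the sketch below designs two 2nd-order narrow band-pass IIR sections and sums their outputs in a parallel structure; it uses Python/SciPy rather than the project's MATLAB and VisualDSP++ tool chain, and the sampling rate and band edges are assumed values.

import numpy as np
from scipy import signal

fs = 8000.0                                    # assumed sampling rate (Hz), not the project's

def bandpass_biquad(f_low, f_high, ftype="butter", rp=1, rs=40):
    # SciPy doubles the order for band-pass designs, so N=1 yields one 2nd-order section.
    return signal.iirfilter(N=1, Wn=[f_low, f_high], btype="bandpass",
                            ftype=ftype, rp=rp, rs=rs, fs=fs, output="ba")

# Two narrow band-pass sections (illustrative centre frequencies).
b1, a1 = bandpass_biquad(180, 220)                      # Butterworth around 200 Hz
b2, a2 = bandpass_biquad(380, 420, ftype="ellip")       # elliptic around 400 Hz

# Parallel structure: the overall output is the sum of the section outputs,
# giving a higher-order, more flexible response from simple biquads.
t = np.arange(0, 0.5, 1 / fs)
accel = (np.sin(2 * np.pi * 200 * t) + np.sin(2 * np.pi * 400 * t)
         + 0.2 * np.random.randn(t.size))               # stand-in for the 1201F signal
y = signal.lfilter(b1, a1, accel) + signal.lfilter(b2, a2, accel)
print("output RMS:", np.sqrt(np.mean(y ** 2)))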
Abstract:
The creation of atlases, or digital models where information from different subjects can be combined, is a field of increasing interest in biomedical imaging. When a single image does not contain enough information to appropriately describe the organism under study, it is necessary to acquire images of several individuals, each of them containing complementary data with respect to the rest of the components in the cohort. This approach allows creating digital prototypes, ranging from anatomical atlases of human patients and organs, obtained for instance from Magnetic Resonance Imaging, to gene expression cartographies of embryo development, typically achieved from Light Microscopy. Within such a context, this PhD Thesis proposes, develops and validates new dedicated image processing methodologies that, based on image registration techniques, bring information from multiple individuals into alignment within a single digital atlas model. It also elaborates a dedicated software visualization platform to explore the resulting wealth of multi-dimensional data, and novel analysis algorithms to automatically mine the generated resource in search of biological insights. In particular, this work focuses on gene expression data from developing zebrafish embryos imaged at the cellular resolution level with Two-Photon Laser Scanning Microscopy. Disposing of quantitative measurements relating multiple gene expressions to cell position and their evolution in time is a fundamental prerequisite to understand embryogenesis multi-scale processes. However, the number of gene expressions that can be simultaneously stained in one acquisition is limited due to optical and labeling constraints. These limitations motivate the implementation of atlasing strategies that can recreate a virtual gene expression multiplex. The developed computational tools have been tested in two different scenarios. The first one is early zebrafish embryogenesis, where the resulting atlas constitutes a link between the phenotype and the genotype at the cellular level. The second one is the late zebrafish brain, where the resulting atlas allows studies relating gene expression to brain regionalization and neurogenesis. The proposed computational frameworks have been adapted to the requirements of both scenarios, such as the integration of partial views of the embryo into a whole-embryo model with cellular resolution, or the registration of anatomical traits with deformable transformation models not dependent on any specific labeling. The software implementation of the atlas generation tool (Match-IT) and the visualization platform (Atlas-IT), together with the gene expression atlas resources developed in this Thesis, are to be made freely available to the scientific community. Lastly, a novel proof-of-concept experiment integrates for the first time 3D gene expression atlas resources with cell lineages extracted from live embryos, opening up the door to correlating genetic and cellular spatio-temporal dynamics.
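The following toy sketch, which is in no way a substitute for the Match-IT / Atlas-IT tools, only illustrates the basic atlasing idea: subject images are resampled into a common reference frame with (here, already known) affine transforms and then averaged voxel-wise; all sizes, shifts and names are made up.

import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
reference_shape = (64, 64)

def to_reference(image, matrix, offset):
    """Resample one subject image into the reference frame (inverse mapping)."""
    return ndimage.affine_transform(image, matrix, offset=offset, order=1)

# Fake cohort: the same pattern observed in three subjects with small rigid shifts.
pattern = np.zeros(reference_shape)
pattern[20:40, 20:40] = 1.0
shifts = [(2.0, -1.0), (-3.0, 2.0), (1.0, 3.0)]
subjects = [ndimage.shift(pattern, s) + 0.05 * rng.standard_normal(reference_shape)
            for s in shifts]
transforms = [(np.eye(2), s) for s in shifts]   # here the alignment is known exactly

# Atlas = voxel-wise mean of the aligned subjects.
atlas = np.mean([to_reference(img, m, off) for img, (m, off) in zip(subjects, transforms)],
                axis=0)
print("atlas peak intensity:", round(float(atlas.max()), 3))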
Abstract:
The award of the digital dividend can consolidate auctions as the preferred mechanism for spectrum allocation. Knowing in advance an estimate of the likely results of an auction with these characteristics would be unquestionably useful for those in charge of designing the process, even if in the end another method, such as a beauty contest, is chosen. This article provides a simulation of a digital dividend auction in a major European country. In one of the scenarios, the spectrum is not pre-allocated to any service in particular (service neutrality), while in the remaining four, blocks of spectrum are pre-allocated to DTT, mobile multimedia and mobile broadband communications. The results of the simulations reveal that the service-neutrality scenario maximizes revenues for the seller and that, in general, DTT operators would seem to have fewer opportunities, as the spectrum packaging is less protective of them.
Abstract:
Digital image correlation (DIC) is applied to analyzing the deformation mechanisms under transverse compression in a fiber-reinforced composite. To this end, compression tests in a direction perpendicular to the fibers were carried out inside a scanning electron microscope, and secondary electron images were obtained at different magnifications during the test. Optimum DIC parameters to resolve the displacement and strain fields were computed from numerical simulations of a model composite, and they were applied to micrographs obtained at different magnifications (250×, 2000×, and 6000×). It is shown that DIC of low-magnification micrographs was able to capture the long-range fluctuations in strain due to the presence of matrix-rich and fiber-rich zones, responsible for the onset of damage. At higher magnification, the strain fields obtained with DIC qualitatively reproduce the non-homogeneous deformation pattern due to the presence of stiff fibers dispersed in a compliant matrix and provide accurate results for the average composite strain. However, comparison with finite element simulations revealed that DIC was not able to accurately capture the average strain in each phase.
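A minimal sketch of the underlying DIC operation, under the usual subset-matching formulation (not the specific DIC software or parameters of the paper): the integer-pixel displacement of a subset is found by maximizing the zero-normalized cross-correlation (ZNCC) between the reference and deformed images. Subset size and search range below are illustrative.

import numpy as np

def zncc(a, b):
    """Zero-normalized cross-correlation between two equally sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def subset_displacement(ref, cur, y, x, half=10, search=5):
    """Integer-pixel displacement of the subset centred at (y, x)."""
    template = ref[y - half:y + half + 1, x - half:x + half + 1]
    best, best_uv = -2.0, (0, 0)
    for dv in range(-search, search + 1):
        for du in range(-search, search + 1):
            window = cur[y + dv - half:y + dv + half + 1,
                         x + du - half:x + du + half + 1]
            score = zncc(template, window)
            if score > best:
                best, best_uv = score, (dv, du)
    return best_uv, best

# Synthetic check: the "deformed" image is the reference shifted by (2, -1) pixels.
rng = np.random.default_rng(1)
ref = rng.random((100, 100))
cur = np.roll(np.roll(ref, 2, axis=0), -1, axis=1)
print(subset_displacement(ref, cur, 50, 50))   # expected: ((2, -1), ~1.0)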
Abstract:
In this paper an approach to the synchronization of chaotic circuits is reported. It is based on an optically programmable logic cell, and the signals involved are fully digital. The same input signal is received by both sender and receiver and, after a correlation between both outputs, an identical chaotic output is obtained in both systems. No conversion from analog to digital signals is needed. The model presented here is based on a computer simulation.
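Purely as a toy illustration of the driving idea, synchronization through a shared digital input, the sketch below makes two identical finite-state cells, started from different states, converge to identical outputs once the common input stream has flushed their initial conditions. It is not the optically programmable logic cell of the paper, and the state update is a plain shift register rather than a genuinely chaotic map.

import random

WIDTH = 16                                   # state width in bits (illustrative)
MASK = (1 << WIDTH) - 1

def step(state, bit):
    """Fully digital update: the state is just a window of the last WIDTH input bits."""
    return ((state << 1) | bit) & MASK

def output(state):
    """Nonlinear digital output derived from the state (a simple parity mix)."""
    return bin(state & 0xB400).count("1") & 1

random.seed(0)
sender, receiver = 0x1234, 0xBEEF            # different initial conditions
matches = 0
steps = 200
for n in range(steps):
    bit = random.getrandbits(1)              # the common input signal
    sender, receiver = step(sender, bit), step(receiver, bit)
    if n >= WIDTH and output(sender) == output(receiver):
        matches += 1
print(f"identical outputs after the transient: {matches} / {steps - WIDTH}")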
Abstract:
The technological world is changing towards the optimization of resource management thanks to the powerful influence of technologies such as virtualization and cloud computing. This report approaches both, from the causes that motivated them to their latest trends, identifying their main features, advantages and disadvantages. In addition, the Digital Home is already a reality for most people: it provides access to multiple types of telecommunication networks (3G, 4G, Wi-Fi, ADSL...) with greater or lesser capacity, allowing Internet connections from anywhere, at any time, and with virtually any device (personal computers, smartphones, tablets, televisions...). Companies take advantage of this to offer all kinds of services. Some of these services are based on cloud computing, above all offering cloud storage to devices with limited capacity, such as smartphones and tablets. That storage space is normally located on servers under the control of large companies. Saving private documents, videos and photos without being certain that they are not viewed by anyone without consent can raise misgivings in some users. For those users who want control over their privacy, there is the possibility that the users themselves set up their own servers and their own cloud service, to share their private information only with family and friends, or with anyone they grant permission to. During the project, several solutions have been compared, most of them open source and freely distributed, that allow deploying at least a storage service accessible through the Internet. Some of them complement it with streaming services for music and video, sharing and synchronization of documents across multiple devices, calendars, backups, desktop virtualization, file versioning, chats, etc. The project ends with a demonstration of how to use digital home devices interacting with a cloud server on which one of the compared solutions has previously been installed and configured. This server is packaged in a virtual machine so that it is easily transportable and usable.