42 results for Virtual Reality Structural Engineering Design
Abstract:
Purpose: Surgical simulators are currently essential within any laparoscopic training program because they provide a low-stakes, reproducible and reliable environment in which to acquire basic skills. The purpose of this study is to determine the training learning curve based on different metrics corresponding to five tasks included in the SINERGIA laparoscopic virtual reality simulator. Methods: Thirty medical students without surgical experience participated in the study. Five tasks of SINERGIA were included: Coordination, Navigation, Navigation and touch, Accurate grasping and Coordinated pulling. Each participant was trained in SINERGIA. This training consisted of eight sessions (R1–R8) of the five mentioned tasks and was carried out over two consecutive days with four sessions per day. A statistical analysis was performed, and the results of R1, R4 and R8 were pair-wise compared with the Wilcoxon signed-rank test. Significance was considered at P < 0.005. Results: In total, 84.38% of the metrics provided by SINERGIA and included in this study show significant differences when comparing R1 and R8. Metrics are mostly improved in the first sessions of training (75.00% when R1 and R4 are compared vs. 37.50% when R4 and R8 are compared). In the tasks Coordination and Navigation and touch, all metrics are improved. By contrast, Navigation improves in only 60% of the analyzed metrics. Most learning curves show improvement, with better results in the performance of the different tasks. Conclusions: Learning curves of metrics that assess the basic psychomotor laparoscopic skills acquired in the SINERGIA virtual reality simulator show a faster learning rate during the first part of the training. Nevertheless, eight repetitions of the tasks are not enough to acquire all the psychomotor skills that can be trained in SINERGIA. Therefore, and based on these results together with previous works, SINERGIA could be used as a training tool with a properly designed training program.
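As a hedged sketch of the statistical procedure described above (the data layout, metric and values are assumptions, not the study's actual analysis code), the pair-wise Wilcoxon signed-rank comparisons between sessions R1, R4 and R8 could be run with SciPy along these lines:

```python
import numpy as np
from scipy.stats import wilcoxon

# Hypothetical data: one metric measured for 30 students in sessions R1, R4, R8.
rng = np.random.default_rng(0)
r1 = rng.normal(60, 10, 30)          # e.g., task completion time in session R1
r4 = r1 - rng.normal(8, 3, 30)       # larger improvement by session R4
r8 = r4 - rng.normal(2, 3, 30)       # smaller improvement by session R8

# Pair-wise Wilcoxon signed-rank tests, mirroring the study design.
for label, (a, b) in {"R1 vs R4": (r1, r4),
                      "R1 vs R8": (r1, r8),
                      "R4 vs R8": (r4, r8)}.items():
    stat, p = wilcoxon(a, b)
    print(f"{label}: W={stat:.1f}, p={p:.4f}, significant={p < 0.005}")
```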
Abstract:
The Hispanic Rite is the liturgy celebrated by Christians of the Iberian Peninsula before the imposition of the Roman Rite in the mid-eleventh century. As in other early Christian liturgies, music was the core of the Hispanic Rite. This music, known as Mozarabic Chant, is one of the richest musical repertoires of the Middle Ages. Currently, a research project is underway involving the restoration of the sound of the Hispanic Rite using techniques of acoustic virtual reality. The project aims to auralize the Mozarabic Chant in its original environment, that is, taking into account the acoustic characteristics of the pre-Romanesque churches in their original state. For this purpose, anechoic recordings were made of a number of musical pieces representative of the Mozarabic Chant repertoire. In total, eight (8) musical pieces were recorded and interpreted, each of them by six (6) different singers. The recordings were made using a spherical array composed of 32 microphones. This paper describes the most relevant aspects of the recorded musical material, the technical specifications and installation details of the recording equipment, the data processing, and a summary of the results.
Abstract:
This paper deals with the assessment of the contribution of the second flexural mode to the dynamic behaviour of simply supported railway bridges. Alluding to the works of other authors, some references suggest that the dynamic behaviour of simply supported bridges could be adequately represented taking into account only the contribution of the fundamental flexural mode. On the other hand, the European Rail Research Institute (ERRI) proposes that the second mode should also be included whenever the associated natural frequency is lower than 30 Hz. This investigation endeavours to clarify the question as much as possible by establishing whether the maximum response of the bridge, in terms of displacements, accelerations and bending moments, can be computed accurately without taking account of the contribution of the second mode. To this end, a dimensionless formulation of the equations of motion of a simply supported beam traversed by a series of equally spaced moving loads is presented. This formulation brings to light the fundamental parameters governing the behaviour of the beam: the damping ratio, the dimensionless speed $\alpha = VT/L$, and the ratio $L/d$ ($L$ stands for the span of the beam, $V$ for the speed of the train, $T$ for the fundamental period of the bridge and $d$ for the distance between consecutive loads). Assuming a damping ratio equal to 1%, which is a usual value for prestressed high-speed bridges, a parametric analysis is conducted over realistic ranges of values of $\alpha$ and $L/d$. The results can be extended to any simply supported bridge subjected to a train of equally spaced loads by virtue of the so-called Similarity Formulae, whose validity can be derived from the dimensionless formulation mentioned above. In the parametric analysis the maximum response of the bridge is obtained for one thousand values of speed covering the range from the fourth resonance of the first mode to the first resonance of the second mode. The response at twenty-one different locations along the span of the beam is compared in order to decide whether the maximum can be accurately computed with the sole contribution of the fundamental mode.
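A minimal sketch of the moving-load model the abstract formulates, with assumed parameter values (not the paper's), integrating the modal equations of a simply supported beam under equally spaced loads and comparing the response with one and with two flexural modes; quarter-span is used because the second mode has a node at midspan:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Assumed, illustrative parameters (not the paper's values).
L, EI, m = 20.0, 8.0e9, 20000.0          # span [m], stiffness [N m^2], mass [kg/m]
zeta, P, d, V = 0.01, 170e3, 15.0, 80.0  # damping, axle load [N], spacing [m], speed [m/s]
n_loads = 10
w = np.array([(n * np.pi / L) ** 2 * np.sqrt(EI / m) for n in (1, 2)])  # omega_n

def rhs(t, y, n_modes):
    q, dq = y[:n_modes], y[n_modes:]
    x = V * t - d * np.arange(n_loads)   # positions of the moving loads
    on = (x >= 0) & (x <= L)             # loads currently on the span
    ddq = np.empty(n_modes)
    for n in range(n_modes):
        f_n = (2 * P / (m * L)) * np.sum(np.sin((n + 1) * np.pi * x[on] / L))
        ddq[n] = f_n - 2 * zeta * w[n] * dq[n] - w[n] ** 2 * q[n]
    return np.concatenate([dq, ddq])

t_end = (L + (n_loads - 1) * d) / V + 1.0
for n_modes in (1, 2):
    sol = solve_ivp(rhs, (0, t_end), np.zeros(2 * n_modes),
                    args=(n_modes,), max_step=1e-3)
    # Deflection at quarter-span: u(L/4, t) = sum_n q_n(t) sin(n*pi/4)
    u = sum(sol.y[n] * np.sin((n + 1) * np.pi / 4) for n in range(n_modes))
    print(f"{n_modes} mode(s): max |u(L/4)| = {1e3 * np.abs(u).max():.2f} mm")
```

Repeating the evaluation at several points along the span, as the paper does at twenty-one locations, shows where the second mode contributes.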
Abstract:
Numerous damage models have been developed in order to analyse the seismic behaviour of structures. Among the different possibilities in the literature, it is clear that models developed along the lines of Continuum Damage Mechanics are more consistent with the definition of damage as a phenomenon with mechanical consequences, because they explicitly include the coupling between damage and mechanical behaviour. On the other hand, in seismic processes, phenomena such as low-cycle fatigue may have a pronounced effect on the overall behaviour of the frames, and their consideration therefore becomes very important. However, many existing models evaluate damage only as a function of the maximum amplitude of cyclic deformation, without considering the number of cycles. In this paper, the simplified model proposed by Flórez is generalized to include low-cycle fatigue. The formulation of the model employs irreversible thermodynamics and internal-state-variable theory.
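As a hedged sketch of the kind of coupling the abstract alludes to, assuming only the standard Continuum Damage Mechanics relations (not the specific equations of the generalized Flórez model), the damage variable enters the constitutive law through an effective stress, and a fatigue-sensitive extension makes damage growth depend on the cyclic history rather than only on the maximum amplitude:

```latex
% Standard CDM coupling: damage d in [0,1] degrades the load-bearing response.
\bar{\sigma} = \frac{\sigma}{1 - d}
% Illustrative fatigue-sensitive evolution (an assumption, not the paper's law):
% damage grows with the plastic excursions and with the number of cycles N.
\dot{d} = f(\sigma, d)\,\dot{p} \; + \; g(\Delta\varepsilon^{p})\,\dot{N}
```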
Experimental Prototype Merging Stereo Panoramic Video and Interactive 3D Content in a 5-sided CAVE™
Abstract:
Immersion and interaction have been identified as key factors influencing the quality of experience in stereoscopic video systems. An experimental prototype designed to explore the influence of these factors in 3D video applications is described here. The focus is on the real-time insertion algorithm of new 3D models into the original video streams. Using this algorithm, our prototype aims to explore a new interaction paradigm, similar to the augmented reality approach, with 3D video applications.
Abstract:
The physical model based on constant moving loads is widely used for the analysis of railway bridges. Nevertheless, this model is not well suited to the study of short-span bridges (L ≤ 15–20 m), and the results it produces (displacements and accelerations) are much greater than those obtained experimentally. In this paper two factors are analysed which are believed to have an influence on the dynamic behaviour of short bridges and which are not accounted for by the moving-loads model: the distribution of the loads due to the presence of the sleepers and the ballast layer, and train-bridge interaction. Several numerical simulations have been performed in order to assess their influence, and the results are presented and discussed herein.
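To illustrate the first of these two factors, one common idealization (e.g., the Eurocode-style split of a point load over three adjacent sleepers) can be sketched as below; the 25/50/25 ratios and the sleeper spacing are assumptions here, not the distribution used in the paper:

```python
import numpy as np

def sleeper_loads(axle_position, axle_load, sleeper_spacing=0.6):
    """Replace a point axle load by loads on the three nearest sleepers,
    using a 25/50/25 split (a common idealization of load distribution
    through rail and sleeper; ratios are an assumption here)."""
    i = round(axle_position / sleeper_spacing)          # nearest sleeper index
    positions = np.array([i - 1, i, i + 1]) * sleeper_spacing
    return positions, axle_load * np.array([0.25, 0.5, 0.25])

# Example: a 170 kN axle at x = 7.3 m with 0.6 m sleeper spacing.
pos, loads = sleeper_loads(7.3, 170e3)
for x, p in zip(pos, loads):
    print(f"sleeper at x = {x:.1f} m carries {p / 1e3:.1f} kN")
```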
Abstract:
"Polymer crystallization is therefore assumed, and in theories often described, to be a multi-step process with many influencing aspects. Because of the chain structure, it is easy to understand that a process which is thermodynamically forced to increase local ordering but is geometrically hindered cannot proceed into a final equilibrium state. As a result, non-equilibrium structures with different characteristics are usually formed, which depend on temperature, pressure, shearing and other parameters." These words, recently written by Professor Bernhard Wunderlich, one of the most prominent researchers in polymer physics, capture the leitmotiv of this thesis. The crystallization mechanism of polymers is still under debate in the polymer physics community, and most experimental findings are still explained by invoking the LH theory. This classical theory, initially formulated by Lauritzen and Hoffman (LH), is in fact a generalization of the crystallization theory for small molecules from the vapor phase. Even though it describes many experimental observations satisfactorily, it is far from explaining the complex phenomenon of polymer crystallization. The theory was first devised in the early 70s at the National Bureau of Standards and underwent several important reformulations along the 80s to fit the experimental findings. Thus, crystallization regime III was introduced into the theory to explain the results of neutron scattering, droplet and quenching experiments; this concept defines the roughness of the crystallization surface and led to the paradigm proposed by Sadler et al. Above all, the great success of the theory is its ability to explain the inverse dependence of the molecular fold length on supercooling, the latter defined as the temperature interval between the equilibrium temperature and the crystallization temperature.

The main scope of this thesis is the study of ordering processes in polyolefins with different degrees of branching by means of computer simulations. The copolymers studied in this work are considered model materials of high molecular homogeneity, from the point of view of both the size and the branching distributions of the polymer chain. These polyolefins were selected because of the great experimental interest in understanding how the physical properties of the materials change with the type and amount of comonomer used, that is, their structure-property relationships. Moreover, a vast amount of experimental data exists for these materials, which is essential when creating a virtual reality such as a simulation. The experience of the Biophym research group is that simulation results should always have a more or less close experimental correlate, and this argument is maintained throughout this report. Empirically, it is well known that the physical properties of polyolefins depend on the type and amount of branches present in the polymeric material. However, no suitable theoretical models exist that explain the underlying mechanisms of the effects of branching. This report is extensive owing to the complexity of the topic under study.

It begins with a general introduction to the basic concepts of macromolecular physics needed to follow the rest of the document: the flexibility of macromolecules, size distributions and moments, and the behaviour in solution and in the melt, together with the corresponding characteristic parameters. Special emphasis is placed on the concept of "entanglement", a key notion when dealing with macromolecules longer than the critical entanglement length. The introduction closes with a review of the state of the art in the simulation of crystallization processes. The second chapter describes in detail the computational methodology used in each study. In the first results chapter, we discuss single-chain simulation studies in dilute solution for linear and short-chain-branched models. Even this simplest case clearly depends on the chosen torsion potential, as discussed throughout the text; for example, the formation of the "baby nuclei" proposed by Muthukumar appears to result from the torsion potential, which favours the most stable torsional states. We therefore analyse other torsion potentials that are also used by other research groups and discuss the resulting crystallization behaviour accordingly. In a second results chapter we study linear and branched long-chain alkane molecules in the melt by atomistic simulations, as a polyethylene-like model. Despite their great detail, atomistic simulations fail to fully capture the experimental observations in supercooled melts, in particular the pre-ordered states. For this reason, the third and fourth results chapters examine short- and long-chain systems using two coarse-grained models (CG-PVA and CG-PE); the CG-PE model was developed during this thesis. Coarse-grained models are far more computationally efficient than atomistic ones and are sufficient to show the phenomena at the scale relevant for crystallization. In all these studies we follow the evolution of the ordering and melting processes in both isothermal and non-isothermal simulations, evaluating physical properties such as stem length, crystallinity and melting/crystallization temperatures, which allows comparison with experimental results. We show clearly that branching delays and hinders the ordering of the polymer chain, so that the ordered crystalline regions shrink as branching grows. As a general conclusion, the systems show a tendency to form locally ordered structures that grow as blocks to fill the crystallization space attainable at a given temperature and time scale. Finally, the observed effects are consistent with other theoretical/simulation and experimental results discussed throughout this report. A summary is given in a chapter of conclusions and future research lines opened by this work. It should be mentioned that the pace of the research increased markedly in the last year, partly because of the remarkable benefits of the coarse-grained methodology, which, despite being very important for this thesis, does not easily translate into publishable papers on its own. This explains why a large part of the results is still in the publication phase.
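Where the thesis evaluates ordering (stem length, crystallinity), a standard analysis in polymer simulations is the nematic bond order parameter P2. The sketch below is a generic illustration on synthetic data, not the thesis' analysis code:

```python
import numpy as np

def bond_order_parameter(coords):
    """Average nematic order parameter P2 of chain bonds, a common proxy
    for local ordering in polymer crystallization simulations."""
    bonds = np.diff(coords, axis=0)
    bonds /= np.linalg.norm(bonds, axis=1, keepdims=True)
    # Order tensor Q = <3/2 u u^T - 1/2 I>; its largest eigenvalue is P2.
    q = 1.5 * np.einsum('ni,nj->ij', bonds, bonds) / len(bonds) - 0.5 * np.eye(3)
    return np.linalg.eigvalsh(q)[-1]

# A perfectly straight chain gives P2 ~ 1, a random coil gives P2 ~ 0.
straight = np.column_stack([np.arange(50.0), np.zeros(50), np.zeros(50)])
coil = np.cumsum(np.random.default_rng(1).normal(size=(50, 3)), axis=0)
print(bond_order_parameter(straight), bond_order_parameter(coil))
```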
Abstract:
In engineering design and development, before the construction and implementation of a project's objectives begin, a series of preliminary analyses and simulations is needed to corroborate the expectations of the initial hypothesis, in order to obtain an empirical reference that satisfies the working or operating conditions of the project's objectives. Often, results satisfying the desired characteristics are obtained by iterating trial-and-error methods. These methods generally repeat the same analysis procedure while varying a series of parameters, so as to adapt a technology to the desired purpose. Nowadays, powerful computers and mathematical solving algorithms are available that solve different kinds of calculation problems quickly and efficiently. It is therefore attractive to develop applications that solve such problems rapidly and accurately for the analysis and synthesis of engineering solutions, especially when similar expressions with varying constants are involved, since solving routines can be written that accept the parameters defining each particular problem. Moreover, by implementing code according to the theoretical basis of a technology, a program valid for the study of any problem related to that technology can be obtained. The present project implements the first stage of the optical-device simulator Slabsim, which represents the energy distribution of an electromagnetic wave at optical frequencies guided through a planar dielectric waveguide, also known as a slab. The simulator consists of a graphical interface built with Matlab GUIDE, the graphical-user-interface development environment of Mathworks©, so that running simulations is simple and intuitive even for users with little knowledge of the theoretical basis of these structures. In this way the engineer needs less time to find a solution that satisfies the requirements of a project involving planar dielectric waveguides, and the tool can serve a wide variety of purposes based on this technology. One of the main objectives of this project is solving the theoretical basis of slab waveguides by computational numerical methods, whose procedures carry over to other mathematical problems and give the author a solid conceptual grounding in them. For this reason, the differential and characteristic equations that arise for this type of structure are solved by these numerical methods in the core of the application, since in some cases no useful analytical expressions are available.
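A minimal sketch of the numerical core described above, solving the characteristic (dispersion) equation of a symmetric dielectric slab for its guided even TE modes; all values and the root-bracketing strategy are illustrative assumptions, not Slabsim's code:

```python
import numpy as np
from scipy.optimize import brentq

lam, n1, n2, a = 1.55e-6, 3.50, 3.17, 0.30e-6  # wavelength, indices, half-width [m]
k0 = 2 * np.pi / lam
R = k0 * np.sqrt(n1**2 - n2**2)                # kappa^2 + gamma^2 = R^2

def disp(kappa):   # even TE modes: kappa * tan(kappa*a) = sqrt(R^2 - kappa^2)
    return kappa * np.tan(kappa * a) - np.sqrt(R**2 - kappa**2)

neff, m, eps = [], 0, 1e-6
while m * np.pi < R * a:                       # one possible root per tan branch
    lo = (m * np.pi + eps) / a
    hi = min((m * np.pi + np.pi / 2 - eps) / a, R * (1 - 1e-12))
    if lo < hi and disp(lo) * disp(hi) < 0:
        kappa = brentq(disp, lo, hi)           # transverse wavenumber in the core
        beta = np.sqrt((n1 * k0) ** 2 - kappa ** 2)  # propagation constant
        neff.append(beta / k0)                 # effective index, n2 < neff < n1
    m += 1
print("even TE effective indices:", [f"{n:.4f}" for n in neff])
```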
Abstract:
Virtual reality (VR) techniques are being used by the scientific community to understand data and draw conclusions from them in an accessible way. However, these techniques are not frequently used to analyse large amounts of data in the life sciences, particularly in genomics, owing to the high complexity of the data (the curse of dimensionality). Nevertheless, new approaches that bring out the truly important characteristics of the data raise the possibility of constructing VR spaces for visually understanding its intrinsic nature. The benefits of representing high-dimensional data in three-dimensional spaces by means of dimensionality reduction and transformation techniques, complemented with a strong component of interaction methods, are well known. Thus, a novel framework designed to help visualize and interact with data about diseases is presented. In this paper the framework is applied to the Van't Veer breast cancer dataset, and oncologists from La Paz Hospital (Madrid) interact with the obtained results. That is to say, a first attempt to generate a visually tangible model of breast cancer disease in order to support the experience of oncologists is presented.
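The paper's framework uses its own transformation pipeline; the sketch below only illustrates the general idea of projecting high-dimensional expression data into a navigable 3D space, here with plain PCA and hypothetical data sizes:

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical data: 78 samples x 5000 genes (stand-in for a real expression matrix).
rng = np.random.default_rng(0)
expression = rng.normal(size=(78, 5000))
labels = rng.integers(0, 2, 78)             # e.g., good vs poor prognosis

# Dimensionality reduction to a 3D space suitable for a VR viewer.
coords3d = PCA(n_components=3).fit_transform(expression)
print(coords3d.shape)                        # (78, 3): one VR-space point per patient
```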
Abstract:
The use of data mining techniques for discovering the gene profiles of diseases such as cancer is becoming common in many research projects. These techniques do not usually analyse in depth the relationships between genes across the different manifestations of the disease (related to patients). This kind of analysis takes a considerable amount of time and is not always the focus of the research; however, it is crucial for generating personalized treatments to fight the disease. This research therefore focuses on finding a mechanism for gene profile analysis to be used by medical and biological experts. Results: The MedVir framework is proposed: an intuitive mechanism based on the visualization of medical data such as gene profiles, patients and clinical data. MedVir, based on an evolutionary optimization technique, is a dimensionality reduction (DR) approach that presents the data in a three-dimensional space. Furthermore, thanks to virtual reality technology, MedVir allows experts to interact with the data and tailor it to their own experience and knowledge.
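As a hedged sketch in the spirit of the approach described (an evolutionary search for a 3D embedding), the toy example below evolves layouts that preserve pairwise distances; MedVir's actual encoding, operators and fitness are not specified here and everything below is an assumption:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 200))                        # hypothetical gene profiles
D = np.linalg.norm(X[:, None] - X[None, :], axis=-1)  # original pairwise distances

def stress(Y):   # how badly a 3D layout distorts the original distances
    d = np.linalg.norm(Y[:, None] - Y[None, :], axis=-1)
    return np.sum((d - D) ** 2)

pop = [rng.normal(size=(40, 3)) for _ in range(20)]
for gen in range(200):                                # simple (mu + lambda)-ES
    pop += [p + rng.normal(scale=0.1, size=p.shape) for p in pop]
    pop = sorted(pop, key=stress)[:20]                # keep the fittest layouts
print("final stress:", stress(pop[0]))
```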
Abstract:
The Semantics Difficulty Model (SDM) measures the difficulty of introducing semantic technology into a company. SDM manages three stage descriptions, which we refer to as "snapshots": a company semantic snapshot, a data snapshot and a semantic application snapshot. Understanding a priori the complexity of introducing semantics into a company is important because it allows the organization to take early decisions, thus saving time and money, mitigating risks and improving innovation, time to market and productivity. SDM works by measuring the Euclidean distance between each initial snapshot and its reference model (the company semantic snapshot reference model, the data snapshot reference model and the semantic application snapshot reference model). The difficulty level is "not at all difficult" when the distance is small and becomes "extremely difficult" when the distance is large. SDM has been tested experimentally with 2000 simulated companies with different arrangements and initial stages. The output is expressed in five linguistic values: "not at all difficult", "slightly difficult", "averagely difficult", "very difficult" and "extremely difficult". As the preliminary results of our SDM simulation indicate, transforming a search application into one that integrates data from different sources with semantics is "slightly difficult", in contrast with data and opinion extraction applications, for which it is "very difficult".
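A hedged sketch of the scoring step as the abstract describes it: a Euclidean distance between a snapshot and its reference model is mapped onto the five linguistic values. The feature encoding and the thresholds are assumptions, not the published model:

```python
import numpy as np

LEVELS = ["not at all difficult", "slightly difficult", "averagely difficult",
          "very difficult", "extremely difficult"]

def difficulty(snapshot, reference, thresholds=(1.0, 2.0, 3.0, 4.0)):
    # Euclidean distance between snapshot vectors, binned into five levels.
    dist = np.linalg.norm(np.asarray(snapshot) - np.asarray(reference))
    return LEVELS[np.searchsorted(thresholds, dist)]

print(difficulty([0.2, 0.8, 0.1], [0.3, 0.7, 0.2]))   # small distance -> easy
print(difficulty([0.0, 0.0, 0.0], [3.0, 3.0, 3.0]))   # large distance -> hard
```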
Abstract:
Acoustic virtual reality technology offers a highly appropriate tool for reconstructing the intangible heritage of the sound of historical buildings. This work is part of a research project whose aim is the virtual restoration of the sound of the Old Hispanic Rite by auralizing the Mozarabic Chant in a series of pre-Romanesque churches of the Iberian Peninsula. This paper presents the most relevant results of the auralizations carried out for the church of Santa María de Melque. For that purpose, a virtual acoustic model of the church was built in the condition the original building had according to the archaeological documentation, anechoic recordings of several pieces of the early Mozarabic Chant repertoire were made, and the auralizations corresponding to different liturgical configurations of the Old Hispanic Rite were produced.
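A minimal sketch of the auralization step itself, assuming a monaural room impulse response exported from the virtual acoustic model; here synthetic decaying noise stands in for the real impulse response and for an anechoic chant take:

```python
import numpy as np
from scipy.signal import fftconvolve

fs = 48_000
rng = np.random.default_rng(0)
anechoic = rng.normal(size=2 * fs)             # stand-in for an anechoic recording
t = np.arange(fs) / fs
rir = rng.normal(size=fs) * np.exp(-t / 0.8)   # placeholder impulse response, ~0.8 s decay

# Auralization = convolution of the dry recording with the room impulse response.
auralized = fftconvolve(anechoic, rir)[: anechoic.size]
auralized /= np.abs(auralized).max()           # normalize before playback
print(auralized.shape)
```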