28 results for visualisation of acoustic data
at Universidad Politécnica de Madrid
Abstract:
Researchers in ecology commonly use multivariate analyses (e.g. redundancy analysis, canonical correspondence analysis, Mantel correlation, multivariate analysis of variance) to interpret patterns in biological data and relate these patterns to environmental predictors. There has been, however, little recognition of the errors associated with biological data and the influence that these may have on predictions derived from ecological hypotheses. We present a permutational method that assesses the effects of taxonomic uncertainty on the multivariate analyses typically used in the analysis of ecological data. The procedure is based on iterative randomizations that randomly re-assign unidentified species at each site to any of the species found at the remaining sites. After each re-assignment of species identities, the multivariate method in question is rerun and a parameter of interest is calculated. Consequently, one can estimate a range of plausible values for the parameter of interest under different scenarios of re-assigned species identities. We demonstrate the use of our approach in the calculation of two parameters with an example involving tropical tree species from western Amazonia: (1) the Mantel correlation between compositional similarity and environmental distances between pairs of sites; and (2) the variance explained by environmental predictors in redundancy analysis (RDA). We also investigated the effects of increasing taxonomic uncertainty (i.e. the number of unidentified species), and of the taxonomic resolution at which morphospecies are determined (genus resolution, family resolution, or fully undetermined species), on the uncertainty range of these parameters. To achieve this, we performed simulations on a tree dataset from southern Mexico by randomly selecting a portion of the species contained in the dataset and classifying them as unidentified at each level of decreasing taxonomic resolution.
An analysis of covariance showed that both taxonomic uncertainty and resolution significantly influence the uncertainty range of the resulting parameters. Increasing taxonomic uncertainty widens the uncertainty range of the parameters estimated in both the Mantel test and RDA. The effects of increasing taxonomic resolution, however, are not as evident. The method presented in this study improves on traditional approaches to studying compositional change in ecological communities by accounting for some of the uncertainty inherent to biological data. We hope that this approach can be routinely used to estimate any parameter of interest obtained from compositional data tables when faced with taxonomic uncertainty.
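The re-assignment loop described above can be sketched in a few lines. This is a minimal Python illustration with made-up abundance and environmental data; the Bray-Curtis dissimilarity and the Mantel statistic are implemented inline as assumptions of the sketch, not the authors' actual code or dataset:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy site-by-species abundance table: 6 sites x 8 species.
# Columns 6 and 7 play the role of unidentified morphospecies.
abund = rng.integers(0, 10, size=(6, 8)).astype(float)
env = rng.normal(size=(6, 2))           # two environmental variables per site

unidentified = [6, 7]
identified = [j for j in range(8) if j not in unidentified]

def bray_curtis(a):
    """Pairwise Bray-Curtis dissimilarity between rows of a."""
    n = a.shape[0]
    d = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            num = np.abs(a[i] - a[j]).sum()
            den = (a[i] + a[j]).sum()
            d[i, j] = num / den if den > 0 else 0.0
    return d

def mantel_r(d1, d2):
    """Pearson correlation between the upper triangles of two distance matrices."""
    iu = np.triu_indices_from(d1, k=1)
    return np.corrcoef(d1[iu], d2[iu])[0, 1]

env_dist = np.sqrt(((env[:, None, :] - env[None, :, :]) ** 2).sum(-1))

# Randomization loop: fold each unidentified species' abundances into a
# randomly chosen identified species, then recompute the Mantel correlation.
r_values = []
for _ in range(200):
    table = abund[:, identified].copy()
    for col in unidentified:
        target = rng.integers(0, len(identified))
        table[:, target] += abund[:, col]
    r_values.append(mantel_r(bray_curtis(table), env_dist))

# Plausible range of the parameter under re-assigned species identities.
r_low, r_high = np.percentile(r_values, [2.5, 97.5])
```

The spread of `r_values` is the uncertainty range the method reports; the same loop applies unchanged to any other statistic computed from the composition table.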
Abstract:
Accurate control over the spent nuclear fuel content is essential for its safe and optimized transportation, storage and management. Consequently, the reactivity of spent fuel and its isotopic content must be accurately determined. Nowadays, predicting isotopic evolution throughout irradiation and decay periods is no longer a problem, thanks to the development of powerful codes and methodologies. In order to have a realistic confidence level in the prediction of spent fuel isotopic content, it is desirable to determine how uncertainties in the basic nuclear data affect isotopic prediction calculations by quantifying their associated uncertainties.
Abstract:
The assessment of the accuracy of parameters related to reactor core performance (e.g., keff) and the fuel cycle (e.g., isotopic evolution/transmutation) due to uncertainties in the basic nuclear data (ND) is a critical issue. Different error propagation techniques (adjoint/forward sensitivity analysis procedures and/or the Monte Carlo technique) can be used to simulate the systematic propagation of uncertainties to the final parameters. To perform this uncertainty assessment, the ENDF covariance files (variances/correlations in energy and cross-correlations between reactions and isotopes) are required. In this paper, we assess the impact of ND uncertainties on the isotopic prediction for a conceptual design of a modular European Facility for Industrial Transmutation (EFIT) at a discharge burnup of 150 GWd/tHM. The complete sets of uncertainty data for cross sections (EAF2007/UN, SCALE6.0/COVA-44G), radioactive decay and fission yield data (JEFF-3.1.1) are processed and used in the ACAB code.
Abstract:
The aim is to study how uncertainty propagates from basic data across different scales and physics phenomena, through complex coupled multi-physics and multi-scale simulations.
Abstract:
Accurate control over the spent nuclear fuel content is essential for its safe and optimized transportation, storage and management. Consequently, the reactivity of spent fuel and its isotopic content must be accurately determined.
Abstract:
The propagation of uncertainty in fuel cycle calculations due to Nuclear Data (ND) is an important issue for:
• present fuel cycles (e.g. the high-burnup fuel programme);
• new fuel cycle designs (e.g. fast breeder reactors and ADS).
Different error propagation techniques can be used:
• sensitivity analysis;
• the Response Surface Method;
• the Monte Carlo technique.
In this paper, the impact of ND uncertainties on decay heat and radiotoxicity is assessed in two applications:
• Fission Pulse Decay Heat calculation (FPDH);
• a conceptual design of the European Facility for Industrial Transmutation (EFIT).
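The Monte Carlo technique listed above can be illustrated on a deliberately tiny depletion model. Every number below is invented for the sketch (it is not EFIT or FPDH data), and the one-nuclide, one-group model stands in for a full burnup code: a cross section is sampled from its assumed covariance, propagated through the model, and the spread of the outputs gives the uncertainty on the final inventory:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative one-group, one-nuclide depletion: N(t) = N0 * exp(-sigma*phi*t).
# All values are placeholders chosen for the sketch.
N0 = 1.0e24            # initial nuclide inventory (atoms)
phi = 1.0e14           # neutron flux (n/cm^2/s)
t = 3.0e7              # irradiation time (s)
sigma_mean = 2.0e-24   # one-group cross section (cm^2)
sigma_rel_unc = 0.05   # assumed 5% relative (1-sigma) uncertainty

# Monte Carlo: sample the cross section, propagate each sample to the
# end-of-irradiation inventory, then summarize the output distribution.
samples = rng.normal(sigma_mean, sigma_rel_unc * sigma_mean, size=5000)
N_final = N0 * np.exp(-samples * phi * t)

N_mean = N_final.mean()
N_rel_unc = N_final.std() / N_mean   # propagated relative uncertainty
```

A real calculation replaces the exponential with the full depletion chain (e.g. as solved by ACAB) and samples all correlated ND parameters at once, but the sample-propagate-summarize structure is the same.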
Abstract:
A small Positron Emission Tomography demonstrator based on LYSO slabs and Silicon Photomultiplier matrices is under construction at the University and INFN of Pisa. In this paper we present the characterization results of the read-out electronics and of the detection system. Two SiPM matrices, composed of 8 × 8 SiPM pixels at 1.5 mm pitch, have been coupled one-to-one to a LYSO crystal array. Custom front-end ASICs were used to read the 64 channels of each matrix. Data from each front-end were multiplexed and sent to a DAQ board for digital conversion; a motherboard collects the data and communicates with a host computer through a USB port. Specific tests were carried out on the system in order to assess its performance. Furthermore, we have measured some of the most important parameters of the system for PET applications.
Abstract:
An important competence of human data analysts is to interpret and explain the meaning of the results of data analysis to end-users. However, existing automatic solutions for intelligent data analysis provide limited help in interpreting and communicating information to non-expert users. In this paper we present a general approach to generating explanatory descriptions of the meaning of quantitative sensor data. We propose a type of web application: a virtual newspaper with automatically generated news stories that describe the meaning of sensor data. This solution integrates a variety of techniques from intelligent data analysis into a web-based multimedia presentation system. We validated our approach on a real-world problem and demonstrated its generality using data sets from several domains. Our experience shows that this solution can facilitate the use of sensor data by general users and, therefore, can increase the utility of sensor network infrastructures.
Abstract:
The microarray technique is powerful, as it allows thousands of genes to be tested at a time, but this produces an overwhelming set of data files containing huge amounts of data, which is difficult to pre-process, separate, classify and correlate in order to extract interesting conclusions. Modern machine learning, data mining and clustering techniques based on information theory are needed to read and interpret the information content buried in those large data sets. The Independent Component Analysis method can be used to correct data affected by corruption processes, or to filter out uncorrectable data, and clustering methods can then group similar genes or classify samples. In this paper a hybrid approach is used to obtain a two-way unsupervised clustering of corrected microarray data.
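The "two-way" part of the clustering step can be sketched compactly: cluster the rows (genes) of the corrected expression matrix, then cluster its columns (samples). This is a minimal numpy illustration on synthetic data with a plain k-means implemented inline; it stands in for the paper's hybrid ICA-plus-clustering pipeline, whose specific algorithms are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(1)

def kmeans(X, k, iters=50, seed=0):
    """Plain k-means on the rows of X; returns integer cluster labels."""
    r = np.random.default_rng(seed)
    centers = X[r.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(0)
    return labels

# Synthetic "corrected" expression matrix: 40 genes x 10 samples,
# with one up-regulated gene/sample block baked in.
expr = rng.normal(size=(40, 10))
expr[:20, :5] += 3.0

gene_labels = kmeans(expr, k=2, seed=2)       # cluster genes (rows)
sample_labels = kmeans(expr.T, k=2, seed=3)   # cluster samples (columns)
```

Running the same routine on the matrix and on its transpose is what makes the clustering two-way: genes and samples each receive their own partition.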
Abstract:
Directive 2003/10/EC of the European Parliament and of the Council, of 6 February 2003, specifies, pursuant to Article 16(1) of Directive 89/391/EEC, the minimum health and safety requirements regarding the exposure of workers to the risks arising from physical agents (noise). In the music industry, and specifically among orchestra musicians, exposure of more than eight hours a day to a sound pressure level of 80 dB(A) or more is very common. This situation can cause workers hearing damage such as hyperacusis, hearing loss, tinnitus or rupture of the basilar membrane, among others. This means that measures must be taken to implement the regulations as reasonably as possible, so that the musician's interpretation, the dynamics and the musical concept to be conveyed to the audience are affected as little as possible. To reduce the auditory load on orchestra musicians from strong sound impacts coming from neighbouring instruments, the use of acoustic shields is being investigated: placed at strategic points in the orchestra, they can reduce the sound impact on the ear by up to 20 dB. Brass and percussion instruments are responsible for the largest emission of sound pressure. To protect the musicians' ears from these impacts, the shields are placed as a barrier between those instruments and the musicians seated in front of them, thus protecting the ears of the most affected musicians. To assess the practical effect these shields produce in an orchestral ensemble, several recordings were made during rehearsals and concerts of various orchestras. The microphones were placed at ear height, no more than 10 cm from the ears of several of the most affected musicians and of the musicians responsible for the strong sound emission.
In this way, the sound pressure levels perceived by each musician can be compared and the level differences between them evaluated. Variable shield configurations are also used to compare the sound pressure differences between the different placement options and thus decide on the best location and configuration of the shields. Then, once the audio samples and the data files measured with an audio analyzer at different positions in the orchestra have been obtained, everything is calibrated and analyzed using a program developed in Matlab, to evaluate the effect of the shields on the musicians' auditory perception, with special emphasis on the analysis of sound pressure level (SPL) differences. By computing the envelope of the level differences, the attenuation effect of the acoustic shields on the orchestra musicians is evaluated statistically. The method is based on the statistics of several musical samples, since with live music the dynamics and the synchronization between the musicians vary from one performance to the next. These factors, together with the fact that each musician's part is different, make it difficult to compare two signals recorded at different points in the orchestra. Several musical samples are therefore needed to evaluate the attenuation effect of the shields in the different configurations mentioned above. The complete study of the effect of the shields, as part of the environment affecting orchestra musicians on stage, aims to improve their working conditions. Abstract: For several years, the European Union has been adopting laws and regulations to protect and give more security to people who are exposed to risk in their jobs.
Being exposed to loud sound pressure levels for many hours at work carries a risk of hearing damage. Particularly in the field of music, the ear is the most important working tool. Not taking care of the ear can cause damage such as hearing loss, tinnitus, hyperacusis, diplacusis, etc. This could have an impact on the efficiency and satisfaction of the musicians when they are playing, which could also cause stress problems. Orchestra musicians, like many other workers in this sector, are usually exposed to a sound level of 80 dB(A) or more for more than eight hours per day. This means that they must satisfy the law and their legal obligations to avoid health problems arising from their job. Putting the new regulations into practice is a challenge for orchestras. They must make sure that the repertoire, with its dynamics, balance and feeling, is not affected by the reduction of sound levels imposed by the law. This study investigates the benefits and disadvantages of using shields as hearing protection during rehearsals and orchestral concerts.
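The core of the SPL level-difference analysis described above (frame-wise SPL per microphone, then the difference between the two positions) can be sketched as follows. The study's own analysis was done in Matlab on real recordings; this is a hedged numpy stand-in on synthetic signals, where the "shielded" channel is simply the "source" channel attenuated by a nominal 10 dB plus noise:

```python
import numpy as np

rng = np.random.default_rng(7)
fs = 8000                      # sample rate (Hz), placeholder value
t = np.arange(fs * 2) / fs     # 2 s of signal

# Stand-ins for the two microphone signals: the mic near the loud
# instrument, and the mic behind the shield (nominally -10 dB).
source = rng.normal(size=t.size)
shielded = source * 10 ** (-10 / 20) + 0.01 * rng.normal(size=t.size)

def frame_spl(x, fs, frame=0.125):
    """Frame-wise sound level in dB re full scale (125 ms frames)."""
    n = int(fs * frame)
    frames = x[: len(x) // n * n].reshape(-1, n)
    rms = np.sqrt((frames ** 2).mean(1))
    return 20 * np.log10(rms + 1e-12)

# Per-frame level difference between the two positions; its statistics
# (here the median) summarize the attenuation provided by the shield.
level_diff = frame_spl(source, fs) - frame_spl(shielded, fs)
median_attenuation = np.median(level_diff)
```

On real orchestral recordings the distribution of `level_diff` over many musical samples, rather than a single median, is what supports the statistical evaluation described above.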
Abstract:
The aim of this paper is to study the importance of nuclear data uncertainties in the prediction of uncertainties in keff for LWR (Light Water Reactor) unit cells. The first part of this work focuses on the comparison of different sensitivity/uncertainty propagation methodologies based on the TSUNAMI and MCNP codes; this study is undertaken for fresh fuel at different operational conditions. The second part of this work studies the burnup effect, where the indirect contribution due to the uncertainty of the isotopic evolution is also analyzed.
Abstract:
We are investigating the performance of a data acquisition system for Time-of-Flight PET, based on LYSO crystal slabs and 64-channel Silicon Photomultiplier matrices (1.2 cm2 of active area each). Measurements have been performed to test the timing capability of the detection system (SiPM matrices coupled to a LYSO slab and the read-out electronics) with both test signals and a radioactive source.
Abstract:
An increasing number of neuroimaging studies are concerned with the identification of interactions or statistical dependencies between brain areas. Dependencies between the activities of different brain regions can be quantified with functional connectivity measures such as the cross-correlation coefficient. An important factor limiting the accuracy of such measures is the amount of empirical data available. For event-related protocols, the amount of data also affects the temporal resolution of the analysis. We use analytical expressions to calculate the amount of empirical data needed to establish whether a certain level of dependency is significant when the time series are autocorrelated, as is the case for biological signals. These analytical results are then contrasted with estimates from simulations based on real data recorded with magnetoencephalography during a resting-state paradigm and during the presentation of visual stimuli. Results indicate that, for broadband signals, 50-100 s of data are required to detect a true underlying cross-correlation coefficient of 0.05. This corresponds to a resolution of a few hundred milliseconds for typical event-related recordings. The required time window increases for narrow-band signals as frequency decreases. For instance, approximately 3 times as much data is necessary for signals in the alpha band. Important implications can be derived for the design and interpretation of experiments to characterize weak interactions, which are potentially important for brain processing.
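The qualitative effect described above, that autocorrelation inflates the amount of data needed to detect a weak correlation, can be illustrated with a toy calculation. This sketch uses Bartlett's classical approximation for the variance of a cross-correlation estimate, with an AR(1) autocorrelation model and an assumed sampling rate; the numbers it produces are illustrative only and are not the analytical expressions or the 50-100 s result of the study:

```python
import numpy as np

def required_samples(r0, ar_coef, z=1.96):
    """Samples needed for a true cross-correlation r0 to exceed the
    ~z-sigma sampling noise, for two AR(1) signals with coefficient
    ar_coef.  Uses Bartlett's approximation
        Var(r) ~ (1/N) * sum_k rho_x(k) * rho_y(k),
    where for AR(1) the sum over all lags equals (1+a^2)/(1-a^2)."""
    inflation = (1 + ar_coef ** 2) / (1 - ar_coef ** 2)
    return int(np.ceil((z / r0) ** 2 * inflation))

fs = 600                                 # assumed MEG sampling rate (Hz)
n_white = required_samples(0.05, 0.0)    # temporally uncorrelated samples
n_alpha = required_samples(0.05, 0.9)    # strongly autocorrelated samples

seconds_white = n_white / fs
seconds_alpha = n_alpha / fs             # much longer recording needed
```

The ratio `n_alpha / n_white` shows the inflation factor directly: the stronger the autocorrelation, the fewer effectively independent samples each second of recording contributes.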
Abstract:
Complex networks have been extensively used in the last decade to characterize and analyze complex systems, and they have recently been proposed as a novel instrument for the analysis of spectra extracted from biological samples. Yet the high number of measurements composing each spectrum, and the consequent high computational cost, make a direct network analysis unfeasible. We here present a comparative analysis of three customary feature selection algorithms, including the binning of spectral data and the use of information theory metrics. The algorithms are compared by assessing the score obtained in a classification task in which healthy subjects and people suffering from different types of cancer are to be discriminated. Results indicate that a feature selection strategy based on Mutual Information outperforms the more classical data binning, while reducing the dimensionality of the data set by two orders of magnitude.
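A Mutual-Information-based feature selection of the kind compared above can be sketched as follows: estimate the MI between each spectral measurement and the class label, then keep the highest-scoring features. This is a minimal numpy illustration on synthetic "spectra" with an inline histogram MI estimator; the estimator, bin count and data are assumptions of the sketch, not the paper's configuration:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic spectra: 100 subjects x 500 measurements; class-1 subjects
# have a shifted intensity in features 10-14 only.
X = rng.normal(size=(100, 500))
y = np.repeat([0, 1], 50)
X[y == 1, 10:15] += 2.0

def mutual_info(feature, labels, bins=8):
    """Histogram estimate of MI (nats) between one feature and the class."""
    edges = np.histogram_bin_edges(feature, bins=bins)
    fx = np.clip(np.digitize(feature, edges[1:-1]), 0, bins - 1)
    joint = np.zeros((bins, 2))
    for b, c in zip(fx, labels):
        joint[b, c] += 1
    joint /= joint.sum()
    px = joint.sum(1, keepdims=True)
    py = joint.sum(0, keepdims=True)
    nz = joint > 0
    return (joint[nz] * np.log(joint[nz] / (px @ py)[nz])).sum()

# Score every feature against the label, keep the 10 most informative:
# a two-orders-of-magnitude reduction from 500 measurements.
mi = np.array([mutual_info(X[:, j], y) for j in range(X.shape[1])])
top10 = np.argsort(mi)[::-1][:10]
```

The retained columns `X[:, top10]` would then feed the classifier (or the network construction step) in place of the full spectrum.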
Abstract:
A basic requirement of the data acquisition systems used in long-pulse fusion experiments is the real-time detection of physical events in signals. Developing such applications is usually a complex task, so it is necessary to develop a set of hardware and software tools that simplify their implementation. This type of application can be implemented in ITER using fast controllers. ITER is standardizing the architectures to be used for fast controller implementation. The standards chosen so far are PXIe architectures (based on PCIe) for the hardware and EPICS middleware for the software. This work presents a methodology for implementing data acquisition and pre-processing using FPGA-based DAQ cards, and shows how to integrate these into fast controllers using EPICS.
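The event-detection task that this methodology targets can be illustrated with a deliberately simple software model: scan a signal for threshold crossings and merge crossings that belong to the same event. This numpy sketch is only a stand-in for the FPGA pre-processing logic described above; the signal, threshold and merge window are all invented for the illustration, and nothing here uses the actual EPICS or PXIe APIs:

```python
import numpy as np

rng = np.random.default_rng(5)

fs = 10000                                # placeholder sample rate (Hz)
signal = 0.1 * rng.normal(size=fs)        # 1 s of baseline noise
signal[3000:3050] += 5.0                  # injected "physical event" #1
signal[7000:7060] += 4.0                  # injected "physical event" #2

def detect_events(x, threshold, min_gap):
    """Return the first sample index of each threshold crossing,
    merging crossings closer than min_gap samples into one event."""
    above = np.flatnonzero(x > threshold)
    events = []
    for i in above:
        if not events or i - events[-1] >= min_gap:
            events.append(int(i))
    return events

events = detect_events(signal, threshold=2.0, min_gap=500)
```

In the architecture described above, logic of this kind runs on the FPGA-based DAQ card so that only event timestamps, not raw samples, need to be published through the fast controller.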