922 results for "Functional data analysis"


Relevance: 100.00%

Abstract:

The object of this report is to present the data and conclusions drawn from the analysis of the origin and destination information. Comments on the advisability and correctness of the approach used by Iowa are encouraged.

Relevance: 100.00%

Abstract:

New morpho-bathymetric and tectono-stratigraphic data on the Naples and Salerno Gulfs, derived from bathymetric and seismic data analysis and integrated geologic interpretation, are presented here. The CUBE (Combined Uncertainty Bathymetric Estimator) method has been applied to complex morphologies, such as the Capri continental slope and the related geological structures occurring in the Salerno Gulf. The bathymetric data analysis has been carried out for marine geological maps of the whole Campania continental margin at scales ranging from 1:25,000 to 1:10,000, including focused examples in the Naples and Salerno Gulfs, Naples harbour, the Capri and Ischia Islands, and the Salerno Valley. Seismic data analysis has allowed the main morpho-structural lineaments recognized at a regional scale through multichannel profiles to be correlated with morphological features cropping out at the sea bottom, evident from the bathymetry. The main fault systems in the area are represented on a tectonic sketch map, including the master fault located north of the Salerno Valley half-graben. Some normal faults parallel to the master fault have been interpreted from the slope map derived from the bathymetric data. A complex system of antithetic faults bounds two morpho-structural highs located 20 km south of Capri Island. Seismic interpretation has also shown hints of compressional reactivation of normal faults in an extensional setting involving the whole Campania continental margin.
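
As a hedged aside on the slope-map step mentioned above: a slope map is typically derived from a gridded bathymetric surface by finite differences. The sketch below illustrates only that generic technique; the grid spacing and the toy grid are assumptions, not values from this work.

```python
import numpy as np

def slope_map(depth, dx=25.0, dy=25.0):
    """Slope (degrees) of a gridded bathymetric surface.

    depth  : 2D array of depths (m)
    dx, dy : grid spacing (m); assumed values, not from the source
    """
    dz_dy, dz_dx = np.gradient(depth, dy, dx)
    return np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

# Toy example: a plane dipping along x at 5 m per 25 m cell
z = np.tile(np.linspace(0.0, 500.0, 101), (50, 1))
print(slope_map(z).mean())  # about 11.3 degrees
```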

Relevance: 100.00%

Abstract:

Wind-generated waves in the Kara, Laptev, and East Siberian Seas are investigated using altimeter data from Envisat RA-2 and SARAL-AltiKa. Only isolated ice-free zones, identified using ice concentration data from microwave radiometers, were selected for analysis; in such zones, wind seas can be treated as pure wind-generated waves without any contamination by ambient swell. Altimeter data for these areas, both significant wave height (SWH) and wind speed, were obtained for the period 2002-2012 from Envisat RA-2 measurements and for 2013 from SARAL-AltiKa. Dependencies of dimensionless SWH and wavelength on the dimensionless spatial scale of wave generation are compared to known empirical dependencies for fetch-limited wind wave development. We further check the sensitivity of the Ka- and Ku-bands and discuss the new possibilities that AltiKa's higher resolution can open.
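
For context on the dimensionless quantities compared above: SWH and the wave-generation scale are commonly non-dimensionalized with wind speed and gravity before being matched against empirical fetch-limited growth laws. The sketch below shows that generic scaling; the JONSWAP-type coefficient is an illustrative assumption, not a result of this study.

```python
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2

def dimensionless_growth(swh, wind_speed, fetch):
    """Dimensionless SWH and fetch for fetch-limited wave growth."""
    h_star = G * swh / wind_speed**2    # dimensionless wave height
    x_star = G * fetch / wind_speed**2  # dimensionless fetch
    return h_star, x_star

# Illustrative JONSWAP-type power law (coefficient is an assumption):
x_star = np.logspace(2, 5, 4)
print(1.6e-3 * np.sqrt(x_star))  # predicted dimensionless SWH
```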

Relevance: 100.00%

Abstract:

This text is taken from the postgraduate thesis that one of the authors (A.B.) developed for the degree of Medical Physicist at the School of Medical Physics of the University of Florence. The text explores the feasibility of quantitative Magnetic Resonance Spectroscopy as a tool for daily clinical routine use. The results and analysis come from two types of hyperspectral images: hyperspectral images from a standard phantom (reference images), and hyperspectral images obtained from a group of patients who underwent MRI examinations at the Santa Maria Nuova Hospital. This interdisciplinary work stems from the IFAC-CNR know-how in data analysis and nanomedicine and from the clinical expertise of Radiologists and Medical Physicists. The results reported here, which were the subject of the thesis, are original, unpublished, and represent independent work.
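
One common route to quantitative MRS, consistent with the phantom-plus-patient design described above, is external referencing: a patient peak area is scaled by the same peak measured in a phantom of known concentration. The sketch below illustrates only that generic idea; the function, the integration band, and all names are assumptions, not the thesis pipeline.

```python
import numpy as np

def quantify_metabolite(patient_spectrum, phantom_spectrum, freq_axis,
                        band, phantom_conc):
    """External-reference MRS quantification (illustrative sketch).

    Integrates the metabolite peak over a frequency band and scales by
    a phantom of known concentration acquired with the same protocol.
    """
    sel = (freq_axis >= band[0]) & (freq_axis <= band[1])
    area_patient = np.trapz(patient_spectrum[sel], freq_axis[sel])
    area_phantom = np.trapz(phantom_spectrum[sel], freq_axis[sel])
    return phantom_conc * area_patient / area_phantom
```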

Relevance: 100.00%

Abstract:

For a long time, electronic data analysis has been associated with quantitative methods. However, Computer Assisted Qualitative Data Analysis Software (CAQDAS) is increasingly being developed. Although CAQDAS has been around for decades, very few qualitative health researchers report using it. This may be due to the effort required to master the software and to the misconceptions associated with using CAQDAS. While the issue of mastering CAQDAS has received ample attention, little has been done to address these misconceptions. In this paper, the author reflects on his experience of interacting with one popular CAQDAS package (NVivo) in order to provide evidence-based implications of using such software. The key message is that, unlike statistical software, the main function of CAQDAS is not to analyse data but to aid the analysis process, of which the researcher must always remain in control. In other words, researchers must know that no software can analyse qualitative data. CAQDAS packages are essentially data management tools that support the researcher during analysis.

Relevance: 100.00%

Abstract:

In this work, we further extend the recently developed adaptive data analysis method, the Sparse Time-Frequency Representation (STFR) method. This method is based on the assumption that many physical signals inherently contain AM-FM representations. We propose a sparse optimization method to extract the AM-FM representations of such signals. We prove the convergence of the method for periodic signals under certain assumptions and provide practical algorithms specifically for the non-periodic STFR, which extends the method to tackle problems that former STFR methods could not handle, including stability to noise and non-periodic data analysis. This is a significant improvement, since many adaptive and non-adaptive signal processing methods are not fully capable of handling non-periodic signals. Moreover, we propose a new STFR algorithm to study intrawave signals with strong frequency modulation and analyze the convergence of this new algorithm for periodic signals; such signals have previously remained a bottleneck for all signal processing methods. Furthermore, we propose a modified version of STFR that facilitates the extraction of intrawaves that have overlapping frequency content. We show that the STFR methods can be applied to the realm of dynamical systems and cardiovascular signals. In particular, we present a simplified and modified version of the STFR algorithm that is potentially useful for the diagnosis of some cardiovascular diseases. We further explain some preliminary work on the nature of Intrinsic Mode Functions (IMFs) and how they can have different representations in different phase coordinates. This analysis shows that the uncertainty principle is fundamental to all oscillating signals.
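
To make the AM-FM model concrete: a signal of the form a(t)·cos(θ(t)) has a slowly varying envelope a(t) and an instantaneous frequency θ'(t)/2π. The toy sketch below builds such a signal and recovers both via the analytic signal; this is a standard illustration, not the STFR algorithm itself.

```python
import numpy as np
from scipy.signal import hilbert

fs = 1000.0                        # sampling rate, Hz
t = np.arange(0.0, 2.0, 1.0 / fs)

# Synthetic AM-FM signal a(t) * cos(theta(t))
a = 1.0 + 0.3 * np.cos(2 * np.pi * 1.0 * t)               # envelope
theta = 2 * np.pi * (50 * t + 5 * np.sin(2 * np.pi * t))  # phase
x = a * np.cos(theta)

# Analytic-signal estimates of envelope and instantaneous frequency
z = hilbert(x)
envelope = np.abs(z)
inst_freq = np.diff(np.unwrap(np.angle(z))) * fs / (2 * np.pi)
print(envelope[:3], inst_freq[:3])
```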

Relevance: 100.00%

Abstract:

Communication presented at the 44th SEFI Conference, 12-15 September 2016, Tampere, Finland

Relevance: 100.00%

Abstract:

Analyzing large-scale gene expression data is a labor-intensive and time-consuming process. To make data analysis easier, we developed a set of pipelines for the rapid processing and analysis of poplar gene expression data for knowledge discovery. Among these, the differentially expressed genes (DEG) pipeline identifies biologically important genes that are differentially expressed at one or multiple time points or conditions. The pathway analysis pipeline identifies differentially expressed metabolic pathways. The protein domain enrichment pipeline identifies protein domains enriched in the DEGs. Finally, the Gene Ontology (GO) enrichment analysis pipeline identifies GO terms enriched in the DEGs. Our pipeline tools can analyze both microarray and high-throughput sequencing gene expression data, which are obtained by two different technologies: microarray technology measures gene expression levels via microarray chips, collections of microscopic DNA spots attached to a solid (glass) surface, whereas high-throughput sequencing, also called next-generation sequencing, measures gene expression levels by directly sequencing mRNAs and obtaining each mRNA's copy number in cells or tissues. We also developed a web portal (http://sys.bio.mtu.edu/) to make all pipelines publicly available and help users analyze their own gene expression data. In addition to the analyses mentioned above, it can also perform GO hierarchy analysis, i.e., construct GO trees from a list of GO terms given as input.
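
As a concrete illustration of the GO enrichment step: enrichment of a GO term in a DEG list is commonly assessed with a hypergeometric test. The sketch below shows that generic test; the numbers in the example call are made up, and the abstract does not state which statistic the pipeline actually uses.

```python
from scipy.stats import hypergeom

def go_enrichment_pvalue(n_genome, n_term, n_degs, n_degs_with_term):
    """Hypergeometric p-value for GO term enrichment in a DEG list.

    n_genome         : total annotated genes
    n_term           : genes annotated with the GO term
    n_degs           : size of the DEG list
    n_degs_with_term : DEGs annotated with the GO term
    """
    # P(X >= n_degs_with_term) when drawing n_degs genes at random
    return hypergeom.sf(n_degs_with_term - 1, n_genome, n_term, n_degs)

print(go_enrichment_pvalue(20000, 150, 400, 12))  # made-up numbers
```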

Relevance: 100.00%

Abstract:

Thanks to the advanced technologies and social networks that allow data to be widely shared over the Internet, there is an explosion of pervasive multimedia data, generating high demand for multimedia services and applications that let people easily access and manage multimedia data in various areas. Towards such demands, multimedia big data analysis has become an emerging hot topic in both industry and academia, ranging from basic infrastructure, management, search, and mining to security, privacy, and applications. Within the scope of this dissertation, a multimedia big data analysis framework is proposed for semantic information management and retrieval, with a focus on rare event detection in videos. The proposed framework is able to explore hidden semantic feature groups in multimedia data and to incorporate temporal semantics, especially for video event detection. First, a hierarchical semantic data representation is presented to alleviate the semantic gap issue, and the Hidden Coherent Feature Group (HCFG) analysis method is proposed to capture the correlation between features and separate the original feature set into semantic groups, seamlessly integrating multimedia data across multiple modalities. Next, an Importance Factor based Temporal Multiple Correspondence Analysis (IF-TMCA) approach is presented for effective event detection. Specifically, the HCFG algorithm is integrated with the Hierarchical Information Gain Analysis (HIGA) method to generate the Importance Factor (IF) for producing the initial detection results, and the TMCA algorithm is then proposed to efficiently incorporate temporal semantics for re-ranking and improving the final performance. Finally, a sampling-based ensemble learning mechanism is applied to further accommodate imbalanced datasets. In addition to the multimedia semantic representation and class imbalance problems, lack of organization is another critical issue for multimedia big data analysis. In this framework, an affinity propagation-based summarization method is therefore also proposed to transform unorganized data into a better structure with clean and well-organized information. The whole framework has been thoroughly evaluated across multiple domains, such as soccer goal event detection and disaster information management.
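
On the final step above: one generic form of sampling-based ensemble learning for a rare class is to train each model on all positives plus an equal-size random subsample of negatives, then average the scores. The sketch below shows that pattern only, not the dissertation's exact mechanism.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def balanced_ensemble_scores(X, y, n_models=10, seed=0):
    """Undersampling ensemble for an imbalanced binary problem.

    X : (n, d) feature array; y : (n,) array of 0/1 labels, with the
    positive (rare) class labelled 1. Generic sketch only.
    """
    rng = np.random.default_rng(seed)
    pos, neg = np.where(y == 1)[0], np.where(y == 0)[0]
    scores = np.zeros(len(y))
    for _ in range(n_models):
        sub = rng.choice(neg, size=len(pos), replace=False)
        idx = np.concatenate([pos, sub])
        model = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
        scores += model.predict_proba(X)[:, 1]
    return scores / n_models
```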

Relevance: 100.00%

Abstract:

The aim of this novel experimental study is to investigate the behaviour of a 2 m x 2 m model of a masonry groin vault, built by assembling blocks made of a 3D-printed plastic skin filled with mortar. The groin vault was chosen because this vulnerable roofing system is widespread in the historical heritage. Shaking-table tests are carried out to explore the vault response under two support boundary conditions, involving four lateral confinement modes. Processing of the marker displacement data has made it possible to examine the collapse mechanisms of the vault, based on the deformed shapes of the arches. A numerical evaluation then follows, to provide the orders of magnitude of the displacements associated with these mechanisms. Given that these displacements are related to the shortening and elongation of the arches, the final objective is the definition of a critical elongation between two diagonal bricks and, consequently, of a diagonal portion. This study continues previous work and takes a further step in the research on ground-motion effects on masonry structures.
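
Since the final objective above is a critical elongation between two bricks, the sketch below shows the generic computation of an elongation history from two tracked marker trajectories; the array shapes and units are assumptions, not the experiment's data format.

```python
import numpy as np

def elongation(marker_a, marker_b):
    """Elongation history of the segment joining two markers.

    marker_a, marker_b : (n_frames, 3) arrays of marker positions (m).
    Returns the change in distance relative to the first frame:
    positive values mean elongation, negative values shortening.
    """
    dist = np.linalg.norm(marker_a - marker_b, axis=1)
    return dist - dist[0]
```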

Relevance: 100.00%

Abstract:

Hadrontherapy employs high-energy beams of charged particles (protons and heavier ions) to treat deep-seated tumours: these particles have a favourable depth-dose distribution in tissue, characterized by a low dose in the entrance channel and a sharp maximum (Bragg peak) near the end of their path. In these treatments nuclear interactions have to be considered: beam particles can fragment in the human body, releasing a non-zero dose beyond the Bragg peak, while fragments of human body nuclei can modify the dose released in healthy tissues. These effects are still an open question, given the lack of relevant cross-section data. Space radioprotection can also profit from fragmentation cross-section measurements: interest in long-term crewed space missions beyond Low Earth Orbit has been growing in recent years, but such missions have to cope with major health risks due to space radiation. To this end, risk models are under study; however, huge gaps in the fragmentation cross-section data currently prevent an accurate benchmark of deterministic and Monte Carlo codes. To fill these gaps, the FOOT (FragmentatiOn Of Target) experiment was proposed. It is composed of two independent and complementary setups: an Emulsion Cloud Chamber and an electronic setup made of several subdetectors providing redundant measurements of the kinematic properties of the fragments produced in nuclear interactions between a beam and a target. FOOT aims to measure double-differential cross sections in both angle and kinetic energy, the most complete information with which to address the existing questions. In this Ph.D. thesis, the development of the Trigger and Data Acquisition system for the FOOT electronic setup is presented, together with a first analysis of data from a 400 MeV/u 16O beam on a carbon target acquired in July 2021 at GSI (Darmstadt, Germany). Where possible, a comparison with other available measurements is also reported.
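
For orientation on what a double-differential cross-section measurement entails, the schematic sketch below converts fragment counts in one (angle, energy) bin into d²σ/(dΩ dE); it is a textbook-level formula with a placeholder efficiency, not the FOOT analysis chain.

```python
def double_diff_xsec(n_frag, n_beam, target_nuclei_per_cm2,
                     d_omega_sr, d_energy_mev, efficiency=1.0):
    """d^2(sigma)/(dOmega dE) from bin counts (schematic).

    n_frag                : fragments counted in the bin
    n_beam                : beam particles on target
    target_nuclei_per_cm2 : areal density of target nuclei
    d_omega_sr            : solid-angle bin width (sr)
    d_energy_mev          : kinetic-energy bin width (MeV)
    Returns cm^2 / (sr MeV); efficiency is a placeholder for the
    detector corrections a real analysis would apply.
    """
    return n_frag / (n_beam * target_nuclei_per_cm2
                     * d_omega_sr * d_energy_mev * efficiency)
```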

Relevance: 100.00%

Abstract:

The thesis represents the conclusive outcome of the European Joint Doctorate programme in Law, Science & Technology, funded by the European Commission through the Marie Skłodowska-Curie Innovative Training Networks actions within H2020, grant agreement n. 814177. The tension between data protection and privacy on one side, and the need to grant further uses of processed personal data on the other, is investigated, tracing the technological development of the de-anonymization/re-identification risk with an explorative survey. After acknowledging its span, it is questioned whether a certain degree of anonymity can still be granted, focusing on a double perspective: an objective and a subjective one. The objective perspective focuses on the data processing models per se, while the subjective perspective investigates whether the distribution of roles and responsibilities among stakeholders can ensure data anonymity.

Relevance: 100.00%

Abstract:

LHC experiments produce an enormous amount of data, estimated at a few petabytes per year. Data management takes place on the Worldwide LHC Computing Grid (WLCG) infrastructure, for both storage and processing operations. However, in recent years many more resources have become available on High Performance Computing (HPC) farms, which generally have many computing nodes, each with a high number of processors. Large collaborations are working to use these resources as efficiently as possible, compatibly with the constraints imposed by their computing models (data distributed on the Grid, authentication, software dependencies, etc.). The aim of this thesis project is to develop a software framework that allows users to run a typical ATLAS data analysis workflow on HPC systems. The framework is to be deployed on the computing resources of the Open Physics Hub project and on the CINECA Marconi100 cluster, in view of the switch-on of the Leonardo supercomputer, foreseen in 2023.
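
As a generic flavour of farming independent per-file analysis tasks over the cores of an HPC node (not the thesis framework, whose design is not detailed here), the sketch below uses only the Python standard library; process_file is a hypothetical user hook.

```python
import os
from concurrent.futures import ProcessPoolExecutor

def process_file(path):
    """Hypothetical per-file analysis; here it just returns file size."""
    return os.path.getsize(path)

def run_analysis(input_files, max_workers=32):
    """Run process_file over many inputs in parallel on one node."""
    with ProcessPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(process_file, input_files))
```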

Relevance: 100.00%

Abstract:

The Probe for LUminosity MEasurement (PLUME) detector is a luminometer for the LHCb experiment at CERN. It will provide instantaneous luminosity measurements for LHCb during Run 3 of the LHC. The goal of this thesis is to evaluate, with simulated data, the expected performance of PLUME, such as the occupancy of the PMTs that make up the detector, and to report the analysis of the first data obtained by PLUME during a Van der Meer scan. In particular, three measurements of the cross-section value, needed to calibrate the detector, were obtained: σ1Da = (1.14 ± 0.11) mb, σ1Db = (1.13 ± 0.10) mb, and σ2D = (1.20 ± 0.02) mb, where the subscripts 1D and 2D correspond to one-dimensional and two-dimensional Van der Meer scans, respectively. All results are in agreement with one another.
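
The three calibration values quoted above can be combined with a standard inverse-variance weighted average, sketched below using the numbers from the abstract; this simple combination ignores any correlations between the scans.

```python
import numpy as np

values = np.array([1.14, 1.13, 1.20])  # mb
errors = np.array([0.11, 0.10, 0.02])  # mb

weights = 1.0 / errors**2
mean = np.sum(weights * values) / np.sum(weights)
err = np.sqrt(1.0 / np.sum(weights))
print(f"combined sigma = {mean:.2f} +/- {err:.2f} mb")
```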