914 results for Dynamic data analysis
Abstract:
The Probe for LUminosity MEasurement (PLUME) detector is a luminometer for the LHCb experiment at CERN. It will provide instantaneous luminosity measurements for LHCb during Run 3 of the LHC. The goal of this thesis is to evaluate, with simulated data, the expected performance of PLUME, such as the occupancy of the PMTs that make up the detector, and to report the analysis of the first data obtained by PLUME during a Van der Meer scan. In particular, three measurements of the cross-section value, needed to calibrate the detector, were obtained: σ1Da = (1.14 ± 0.11) mb, σ1Db = (1.13 ± 0.10) mb, and σ2D = (1.20 ± 0.02) mb, where the subscripts 1D and 2D refer to one-dimensional and two-dimensional Van der Meer scans, respectively. All results are in agreement with one another.
Abstract:
The thesis is the result of work conducted over a period of six months in the Strategy department of Automobili Lamborghini S.p.A. in Sant'Agata Bolognese (BO), and concerns the study and analysis of Big Data from Lamborghini's connected cars. The Big Data initiative is a project of the Connected Car Project House, an inter-departmental team that works toward the definition of Lamborghini's corporate connectivity strategy and its implementation in the product portfolio. Connected-car data is one of the hottest topics in the automotive industry right now: all of the largest automotive companies are investing heavily in this direction in order to derive the greatest advantages, both from a purely economic point of view, because these data reveal a great deal about the behaviors and habits of each driver, and from a technological point of view, because they will increasingly promote the development of 5G, an important enabler for the future of connectivity. The main purpose of the work, from Lamborghini's perspective, is to analyze the data of the connected cars, in particular a data set of connected Huracáns already placed on the market, and, starting from that point, derive valuable Key Performance Indicators (KPIs) on which the company could partly base the decisions to be made in the near future. The key result obtained at the end of this period was the creation of a dashboard in which it is possible to visualize many parameters and indicators related both to driving habits and to the use of the vehicle itself, which brought great insight into the huge potential and value behind the study of these data. The final demo of the project received great interest, not only from the whole Strategy department but also from all the other business areas of Lamborghini, creating broad awareness that this will be the road to follow in the coming years.
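The abstract does not detail the analysis pipeline itself. As a rough, hypothetical sketch of how trip-level telemetry could be rolled up into per-vehicle KPIs of the kind such a dashboard would display (the file name and column names below are invented, not Lamborghini's actual telemetry fields):

```python
import pandas as pd

# Hypothetical schema: one row per trip of a connected car.
# vin, trip_km, trip_minutes, max_speed_kmh are illustrative placeholders.
trips = pd.read_csv("connected_trips.csv")

per_car = trips.groupby("vin").agg(
    total_km=("trip_km", "sum"),
    trips_count=("trip_km", "size"),
    total_minutes=("trip_minutes", "sum"),
    top_speed=("max_speed_kmh", "max"),
)
# Derived KPIs of the kind a usage dashboard could display.
per_car["avg_trip_km"] = per_car["total_km"] / per_car["trips_count"]
per_car["avg_speed_kmh"] = per_car["total_km"] / (per_car["total_minutes"] / 60)

print(per_car.describe())  # fleet-level summary of the per-car KPIs
```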
Abstract:
The Agile principles, published in the eponymous Manifesto more than 20 years ago, are nowadays embodied in a multitude of frameworks: Scrum, XP, Kanban, Lean, Adaptive, Crystal, etc. The first part of the thesis (Chapters 1 and 2) describes some of these frameworks and analyzes how an Agile approach is used in practice in a specific use case: the development, by a lab51 team, of a software platform supporting an e-grocery system. The differences and similarities with respect to some Agile methods formalized in the literature are examined, explaining the reasons that led the team to depart from these frameworks and illustrating the resulting advantages. In the second part of the thesis (Chapters 3 and 4), an analysis of the data collected by the online supermarket in recent years was carried out with the goal of improving the reordering algorithm. In particular, to forecast the sales of individual products so as to place orders of more appropriate quantity and frequency, several approaches were studied: from statistical time-series forecasting models, to neural networks, to an ad hoc methodology developed for the purpose.
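As a minimal illustration of the first family of approaches mentioned (statistical time-series forecasting of per-product sales), here is a sketch using Holt-Winters exponential smoothing from statsmodels; the file and column names are invented, and this is not the ad hoc methodology developed in the thesis:

```python
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Hypothetical input: a daily sales series for one product.
sales = (
    pd.read_csv("product_sales.csv", parse_dates=["date"])
    .set_index("date")["units_sold"]
    .asfreq("D", fill_value=0)
)

# Holt-Winters with weekly seasonality: a classical statistical
# baseline of the kind compared against neural models in the thesis.
model = ExponentialSmoothing(
    sales, trend="add", seasonal="add", seasonal_periods=7
).fit()

forecast = model.forecast(14)  # demand estimate for the next two weeks
print(forecast.round(1))
```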
Abstract:
There are many natural events that can negatively affect the urban ecosystem, but weather-climate variations are certainly among the most significant. The history of settlements has been characterized by extreme events such as earthquakes and floods, which recur at different times and cause extensive damage to the built heritage on a structural and urban scale. Changes in climate also alter various climatic subsystems, changing rainfall regimes and hydrological cycles and increasing the frequency and intensity of extreme precipitation events (heavy rainfall). From a hydrological risk perspective, it is crucial to understand which future events could occur, and their magnitude, in order to design safer infrastructures. Unfortunately, anticipating future scenarios is not easy, as the complexity of the climate system is enormous. For this thesis, precipitation and discharge extremes were primarily used as data sources. It is important to underline that the two data sets are not separate: changes in the rainfall regime, due to climate change, could significantly affect overflows into receiving water bodies. Understanding and modelling climate change effects on water structures is imperative to support the development of adaptation strategies. The main purpose of this thesis is to identify suitable water structures for a road located along the Tione River. Therefore, through a hydrological analysis of the area, we aim to guarantee the safety of the infrastructure over time. The observations made are intended to underline how stochastic models can improve the quality of an analysis for design purposes and influence design choices.
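The abstract does not specify the stochastic model used. One standard approach for design-oriented analysis of precipitation extremes is to fit a Generalized Extreme Value (GEV) distribution to annual maxima and read off return levels; the sketch below, with invented data, shows the idea:

```python
import numpy as np
from scipy import stats

# Hypothetical data: annual maximum daily precipitation (mm) at a gauge.
annual_max = np.array([62.1, 80.4, 55.0, 91.2, 73.8, 68.5, 104.3,
                       59.9, 87.6, 76.2, 95.0, 70.1, 83.4, 66.7])

# Fit a Generalized Extreme Value distribution to the annual maxima,
# a standard stochastic model for precipitation extremes.
c, loc, scale = stats.genextreme.fit(annual_max)

# Return level: the event exceeded on average once every T years.
for T in (10, 50, 100):
    level = stats.genextreme.isf(1.0 / T, c, loc, scale)
    print(f"{T:>3}-year return level: {level:.1f} mm")
```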
Dynamic Changes in the Mental Rotation Network Revealed by Pattern Recognition Analysis of fMRI Data
Abstract:
We investigated the temporal dynamics and changes in connectivity in the mental rotation network through the application of spatio-temporal support vector machines (SVMs). The spatio-temporal SVM [Mourao-Miranda, J., Friston, K. J., et al. (2007). Dynamic discrimination analysis: A spatial-temporal SVM. Neuroimage, 36, 88-99] is a pattern recognition approach that is suitable for investigating dynamic changes in the brain network during a complex mental task. It does not require a model describing each component of the task or the precise shape of the BOLD impulse response. By defining a time window including a cognitive event, one can use spatio-temporal fMRI observations from two cognitive states to train the SVM. During training, the SVM finds the discriminating pattern between the two states and produces a discriminating weight vector spanning both voxels and time (i.e., spatio-temporal maps). We showed that by applying the spatio-temporal SVM to an event-related mental rotation experiment, it is possible to discriminate between different degrees of angular disparity (0 degrees vs. 20 degrees, 0 degrees vs. 60 degrees, and 0 degrees vs. 100 degrees), and the discrimination accuracy is correlated with the difference in angular disparity between the conditions. For the comparison with the highest accuracy (0 degrees vs. 100 degrees), we evaluated how the most discriminating areas (visual regions, parietal regions, supplementary, and premotor areas) change their behavior over time. The frontal premotor regions became highly discriminating earlier than the superior parietal cortex. There seems to be a parcellation of the parietal regions, with earlier discrimination of the inferior parietal lobe than of the superior parietal lobe during mental rotation. The SVM also identified a network of regions that showed a decrease in BOLD response during the 100 degrees condition relative to the 0 degrees condition (posterior cingulate, frontal, and superior temporal gyrus). This network was also highly discriminating between the two conditions. In addition, we investigated changes in functional connectivity between the most discriminating areas identified by the spatio-temporal SVM. We observed an increase in functional connectivity between almost all areas activated during the 100 degrees condition (bilateral inferior and superior parietal lobe, bilateral premotor area, and SMA) but not between the areas that showed a decrease in BOLD response during the 100 degrees condition.
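A minimal sketch of the core idea, assuming a linear SVM over flattened (time x voxel) windows in the spirit of the cited dynamic discrimination analysis; the dimensions and random arrays below are placeholders for preprocessed BOLD windows:

```python
import numpy as np
from sklearn.svm import LinearSVC

# Hypothetical dimensions: n trials per condition, each observed as a
# (time points x voxels) window around the event.
n_trials, n_time, n_voxels = 40, 8, 5000
rng = np.random.default_rng(0)

# Placeholder data standing in for preprocessed BOLD windows for the
# 0-degree and 100-degree conditions.
X_cond_a = rng.standard_normal((n_trials, n_time, n_voxels))
X_cond_b = rng.standard_normal((n_trials, n_time, n_voxels)) + 0.1

# Core idea of the spatio-temporal SVM: flatten each window so that
# every (time, voxel) pair is a feature, then train a linear classifier.
X = np.concatenate([X_cond_a, X_cond_b]).reshape(2 * n_trials, -1)
y = np.array([0] * n_trials + [1] * n_trials)

clf = LinearSVC(C=1.0).fit(X, y)

# The weight vector, reshaped back to (time x voxels), is the
# discriminating spatio-temporal map described in the abstract.
w_map = clf.coef_.reshape(n_time, n_voxels)
print(w_map.shape)
```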
Abstract:
Recent interest in the validation of general circulation models (GCMs) has been devoted to objective methods. A small number of authors have used the direct synoptic identification of phenomena together with a statistical analysis to perform the objective comparison between various datasets. This paper describes a general method for performing the synoptic identification of phenomena that can be used for an objective analysis of atmospheric, or oceanographic, datasets obtained from numerical models and remote sensing. Methods usually associated with image processing have been used to segment the scene and to identify suitable feature points to represent the phenomena of interest. This is performed for each time level. A technique from dynamic scene analysis is then used to link the feature points to form trajectories. The method is fully automatic and should be applicable to a wide range of geophysical fields. An example is shown of results obtained with this method, using data from a run of the Universities Global Atmospheric Modelling Project GCM.
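A schematic sketch of the two stages described (segmentation of each time level into feature points, then linking across time levels into trajectories), using simple thresholding and greedy nearest-neighbour linking as stand-ins for the paper's image-processing and dynamic scene analysis techniques:

```python
import numpy as np
from scipy import ndimage

def feature_points(field, threshold):
    """Segment one time level and return centroids of detected features."""
    labels, n = ndimage.label(field > threshold)
    return np.array(ndimage.center_of_mass(field, labels, range(1, n + 1)))

def link_trajectories(frames, threshold, max_jump):
    """Greedy nearest-neighbour linking of feature points across time."""
    trajectories = [[tuple(p)] for p in feature_points(frames[0], threshold)]
    for frame in frames[1:]:
        points = feature_points(frame, threshold)
        for traj in trajectories:
            if len(points) == 0:
                break
            d = np.linalg.norm(points - np.array(traj[-1]), axis=1)
            if d.min() <= max_jump:          # extend with the closest point
                j = d.argmin()
                traj.append(tuple(points[j]))
                points = np.delete(points, j, axis=0)
    return trajectories

# Toy usage: a single blob drifting across a sequence of 2-D fields.
frames = np.zeros((5, 50, 50))
for t in range(5):
    frames[t, 20 + 3 * t, 10 + 4 * t] = 1.0
smoothed = ndimage.gaussian_filter(frames, (0, 2, 2))
print(link_trajectories(smoothed, 0.01, max_jump=10))
```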
Abstract:
With the outlook of improving seismic vulnerability assessment for the city of Bishkek (Kyrgyzstan), the global dynamic behaviour of four nine-storey r.c. large-panel buildings in the elastic regime is studied. The four buildings were built during the Soviet era within a serial production system; since they all belong to the same series, they have very similar geometries both in plan and in height. First, ambient vibration measurements are performed in the four buildings. The data analysis, consisting of discrete Fourier transform, modal analysis (frequency domain decomposition) and deconvolution interferometry, yields the modal characteristics and an estimate of the linear impulse response function for the structures of the four buildings. Then, finite element models are set up for all four buildings and the results of the numerical modal analysis are compared with the experimental ones. The numerical models are finally calibrated considering the first three global modes, and their results match the experimental ones with an error of less than 20%.
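Of the techniques listed, frequency domain decomposition admits a compact sketch: estimate the cross-spectral density matrix of the ambient vibration channels and take its singular value decomposition at each frequency, peaks of the first singular value indicating natural frequencies. The synthetic two-channel example below illustrates the idea; it is not the processing chain used in the study:

```python
import numpy as np
from scipy import signal

def fdd_first_singular_value(records, fs, nperseg=1024):
    """Frequency Domain Decomposition: SVD of the cross-spectral matrix.

    records: (n_channels, n_samples) ambient vibration measurements.
    Peaks of the returned first singular value spectrum indicate the
    natural frequencies; the singular vectors estimate mode shapes.
    """
    n_ch = records.shape[0]
    f, _ = signal.csd(records[0], records[0], fs=fs, nperseg=nperseg)
    G = np.zeros((len(f), n_ch, n_ch), dtype=complex)
    for i in range(n_ch):
        for j in range(n_ch):
            _, G[:, i, j] = signal.csd(records[i], records[j], fs=fs,
                                       nperseg=nperseg)
    s1 = np.array([np.linalg.svd(G[k], compute_uv=False)[0]
                   for k in range(len(f))])
    return f, s1

# Toy usage: two noisy channels sharing a 2.5 Hz structural mode.
fs = 100.0
t = np.arange(0, 600, 1 / fs)
rng = np.random.default_rng(1)
mode = np.sin(2 * np.pi * 2.5 * t)
records = np.vstack([mode + 0.5 * rng.standard_normal(t.size),
                     0.8 * mode + 0.5 * rng.standard_normal(t.size)])
f, s1 = fdd_first_singular_value(records, fs)
print(f"peak near {f[np.argmax(s1)]:.2f} Hz")  # expect ~2.5 Hz
```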
Abstract:
The objective of this retrospective study is to follow up on a previous Dynamic Smile Analysis and to videographically analyze and develop averages for soft tissue norms with respect to the display of dentition during speech. These values were then compared cross-sectionally across different age groups to see whether changes attributable to the aging process could be seen. A secondary objective was to compare averages for soft tissue norms in the display of dentition during speech with the corresponding averages during the smile. Materials and Method: Records from a previous study, in which video equipment was used to capture video for 261 subjects, were re-evaluated to find appropriate frames to analyze for speech. Two frames were selected for each subject: one representing the maximal display of the maxillary incisors during speech, and the second representing the widest transverse display of dentition during speech. After 40 subjects were excluded, the data for the remaining 221 subjects were analyzed. These averages were then compared to the averages obtained in the previous study to compare the display of the dentition during speech with its display during the smile. Results: On average, a difference of 1.29 mm was seen between the display of the maxillary incisors at maximal display during speech and during the smile. An average of 7.23 mm of the maxillary incisors is readily visible at maximum display during speech, compared to 8.52 mm during the smile. The constructed smile index was also smaller when measured during speech than the smile index, by an average of 2.58 units. Conclusion: This study helps to establish age-related dynamic norms for the display of dentition during speech. The dynamic measures indicate that the display of dentition is greater, on average, during the smile than during speech.
Abstract:
With the exponential growth in the usage of web-based map services, web GIS applications have become more and more popular. Spatial data indexing, search, analysis, visualization and the resource management of such services are increasingly important for delivering the user-desired Quality of Service (QoS). First, spatial indexing is typically time-consuming and is not available to end users. To address this, we introduce TerraFly sksOpen, an open-source Online Indexing and Querying System for Big Geospatial Data. Integrated with the TerraFly Geospatial database [1-9], sksOpen is an efficient indexing and query engine for processing Top-k Spatial Boolean Queries. Further, we provide ergonomic visualization of query results on interactive maps to facilitate the user's data analysis. Second, due to the highly complex and dynamic nature of GIS systems, it is quite challenging for end users to quickly understand and analyze spatial data, and to efficiently share their own data and analysis results with others. Built on the TerraFly Geospatial database, TerraFly GeoCloud is an extra layer running upon the TerraFly map that can efficiently support many different visualization functions and spatial data analysis models. Furthermore, users can create unique URLs to visualize and share analysis results. TerraFly GeoCloud also provides the MapQL technology to customize map visualization using SQL-like statements [10]. Third, map systems often serve dynamic web workloads and involve multiple CPU- and I/O-intensive tiers, which makes it challenging to meet the response-time targets of map requests while using resources efficiently. Virtualization facilitates the deployment of web map services and improves their resource utilization through encapsulation and consolidation. Autonomic resource management allows resources to be automatically provisioned to a map service and its internal tiers on demand. v-TerraFly is a set of techniques to predict the demand of map workloads online and optimize resource allocations, considering both response time and data freshness as the QoS target. The proposed v-TerraFly system is prototyped on TerraFly, a production web map service, and evaluated using real TerraFly workloads. The results show that v-TerraFly predicts workload demands 18.91% more accurately and allocates resources efficiently to meet the QoS target, improving QoS by 26.19% and saving 20.83% in resource usage compared to traditional peak-load-based resource allocation.
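As a toy illustration of the query type sksOpen serves (Top-k Spatial Boolean Queries: the k nearest objects satisfying a boolean keyword predicate), the sketch below combines an R-tree nearest-neighbour search with a keyword filter. It uses the Python rtree package and invented data, and is unrelated to sksOpen's actual implementation:

```python
from rtree import index

# Toy dataset: (id, (x, y), keywords) records standing in for
# geospatial objects; names and coordinates are illustrative only.
places = [
    (1, (10.0, 20.0), {"museum", "park"}),
    (2, (11.0, 21.0), {"museum", "cafe"}),
    (3, (50.0, 50.0), {"museum"}),
    (4, (10.5, 20.5), {"cafe"}),
]

idx = index.Index()
for pid, (x, y), kw in places:
    idx.insert(pid, (x, y, x, y), obj=kw)

def top_k_boolean(query_pt, required, k):
    """k nearest objects whose keyword set satisfies the boolean filter."""
    x, y = query_pt
    hits = []
    # Ask the R-tree for candidates in distance order, keep those
    # passing the keyword filter until k results are found.
    for item in idx.nearest((x, y, x, y), num_results=len(places),
                            objects=True):
        if required <= item.object:
            hits.append(item.id)
            if len(hits) == k:
                break
    return hits

print(top_k_boolean((10.2, 20.2), {"museum"}, 2))  # -> [1, 2]
```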
Abstract:
Min/max autocorrelation factor analysis (MAFA) and dynamic factor analysis (DFA) are complementary techniques for analysing short (> 15-25 y), non-stationary, multivariate data sets. We illustrate the two techniques using catch rate (cpue) time-series (1982-2001) for 17 species caught during trawl surveys off Mauritania, with the NAO index, an upwelling index, sea surface temperature, and an index of fishing effort as explanatory variables. Both techniques gave coherent results, the most important common trend being a decrease in cpue during the latter half of the time-series, and the second most important being an increase during the first half. A DFA model with SST and UPW as explanatory variables and two common trends gave good fits to most of the cpue time-series. (c) 2004 International Council for the Exploration of the Sea. Published by Elsevier Ltd. All rights reserved.
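DFA in this sense is usually fit as a state-space model with a small number of common trends plus explanatory variables. A minimal sketch with synthetic stand-in data, using statsmodels' DynamicFactor (17 series, two trends, and two exogenous variables named after the abstract's SST and UPW), conveys the structure:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.dynamic_factor import DynamicFactor

# Synthetic stand-in for 17 standardized cpue time-series over 20 years;
# real data would replace this block.
rng = np.random.default_rng(2)
trend = np.cumsum(rng.standard_normal(20))
cpue = pd.DataFrame(
    {f"species_{i}": 0.5 * trend + rng.standard_normal(20) for i in range(17)}
)
cpue = (cpue - cpue.mean()) / cpue.std()

# Hypothetical explanatory variables playing the role of SST and UPW.
exog = pd.DataFrame(rng.standard_normal((20, 2)), columns=["SST", "UPW"])

# DFA as in the abstract: two common trends plus explanatory variables.
model = DynamicFactor(cpue, k_factors=2, factor_order=1, exog=exog)
res = model.fit(disp=False)
print(res.factors.smoothed)  # the two estimated common trends
```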
Abstract:
This paper proposes a regression model based on the modified Weibull distribution, which can be used to model bathtub-shaped failure rate functions. Assuming censored data, we consider maximum likelihood and jackknife estimators for the parameters of the model. We derive the appropriate matrices for assessing local influence on the parameter estimates under different perturbation schemes, and we also present some ways to perform global influence analysis. In addition, for different parameter settings, sample sizes and censoring percentages, various simulations are performed, and the empirical distribution of the modified deviance residual is displayed and compared with the standard normal distribution. These studies suggest that the residual analysis usually performed in normal linear regression models can be straightforwardly extended to a martingale-type residual in log-modified Weibull regression models with censored data. Finally, we analyze a real data set under log-modified Weibull regression models. A diagnostic analysis and model checking based on the modified deviance residual are performed to select appropriate models. (c) 2008 Elsevier B.V. All rights reserved.
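For concreteness, a hedged sketch of the censored maximum-likelihood step for the modified Weibull baseline, assuming the common parameterization S(t) = exp(-a t^b e^(λt)) with no covariates and toy data; the paper's full regression structure and influence diagnostics are beyond this fragment:

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_lik(params, t, delta):
    """Negative log-likelihood of the modified Weibull under right censoring.

    Hazard h(t) = a (b + lam t) t^(b-1) e^(lam t); delta = 1 marks an
    observed failure (contributes log f = log h + log S), delta = 0 a
    censored time (contributes log S only).
    """
    a, b, lam = np.exp(params)  # optimize on the log scale: a, b, lam > 0
    log_h = np.log(a) + np.log(b + lam * t) + (b - 1) * np.log(t) + lam * t
    log_S = -a * t**b * np.exp(lam * t)
    return -np.sum(delta * log_h + log_S)

# Toy right-censored sample (times and censoring indicators are invented).
t = np.array([0.3, 0.9, 1.2, 1.5, 2.1, 2.8, 3.0, 3.4, 4.2, 5.0])
delta = np.array([1, 1, 1, 0, 1, 1, 0, 1, 1, 0])

res = minimize(neg_log_lik, x0=np.zeros(3), args=(t, delta),
               method="Nelder-Mead")
a_hat, b_hat, lam_hat = np.exp(res.x)
print(f"a = {a_hat:.3f}, b = {b_hat:.3f}, lambda = {lam_hat:.3f}")
```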
Abstract:
In this study, regression models are evaluated for grouped survival data when the effect of censoring time is considered in the model and the regression structure is modeled through four link functions. The methodology for grouped survival data is based on life tables, with the times grouped into k intervals so that ties are eliminated. The data modeling is thus performed with discrete lifetime regression models. The model parameters are estimated using the maximum likelihood and jackknife methods. To detect influential observations in the proposed models, diagnostic measures based on case deletion, termed global influence, and measures based on small perturbations in the data or in the model, referred to as local influence, are used. In addition to those measures, the total local influence estimate is also employed. Various simulation studies are performed to compare the performance of the four link functions of the regression models for grouped survival data under different parameter settings, sample sizes and numbers of intervals. Finally, a data set is analyzed using the proposed regression models. (C) 2010 Elsevier B.V. All rights reserved.
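One link function commonly used for grouped survival data, the complementary log-log, can be fit as a binomial GLM on person-interval data; the sketch below uses simulated data and illustrative column names, not the data set or exact model of the paper, and swapping in logit, probit, or log-log links gives the other variants:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Toy person-interval data for discrete-time (grouped) survival:
# each subject contributes one row per interval survived, with
# event = 1 in the interval where failure occurs.
rows = []
rng = np.random.default_rng(3)
for subj in range(200):
    x = rng.binomial(1, 0.5)            # a single binary covariate
    for interval in range(1, 6):        # k = 5 intervals
        p = 0.10 + 0.08 * x             # true discrete-time hazard
        event = rng.binomial(1, p)
        rows.append((subj, interval, x, event))
        if event:
            break
df = pd.DataFrame(rows, columns=["subj", "interval", "x", "event"])

# Interval dummies act as the baseline hazard; the complementary
# log-log link is one of the four link functions compared.
X = pd.get_dummies(df["interval"], prefix="int").astype(float)
X["x"] = df["x"]
glm = sm.GLM(df["event"], X,
             family=sm.families.Binomial(link=sm.families.links.CLogLog()))
print(glm.fit().summary())
```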
Wavelet correlation between subjects: A time-scale data driven analysis for brain mapping using fMRI
Abstract:
Functional magnetic resonance imaging (fMRI) based on the BOLD signal has been used to indirectly measure the local neural activity induced by cognitive tasks or stimulation. Most fMRI data analysis is carried out using the general linear model (GLM), a statistical approach that predicts the changes in the observed BOLD response based on an expected hemodynamic response function (HRF). When the task is cognitively complex, or in cases of disease, variations in the shape and/or delay of the response may reduce the reliability of the results. This paper introduces a novel exploratory method for fMRI data that attempts to discriminate neurophysiological signals induced by the stimulation protocol from artifacts and other confounding factors. The method is based on the fusion of correlation analysis and the discrete wavelet transform, to identify similarities in the time course of the BOLD signal across a group of volunteers. We illustrate the usefulness of this approach by analyzing fMRI data from normal subjects presented with standardized human face pictures expressing different degrees of sadness. The results show that the proposed wavelet correlation analysis has greater statistical power than conventional GLM or time-domain intersubject correlation analysis. (C) 2010 Elsevier B.V. All rights reserved.
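A minimal sketch of the fusion described, assuming per-scale Pearson correlation of discrete wavelet coefficients between two subjects' time courses (using PyWavelets); the authors' exact estimator and inference procedure may differ:

```python
import numpy as np
import pywt

def wavelet_correlation(ts_a, ts_b, wavelet="db4", level=4):
    """Scale-wise correlation between two BOLD time courses.

    Decomposes both series with the discrete wavelet transform and
    correlates the coefficients at each scale, so that intersubject
    similarity is assessed per time-scale rather than only in the
    raw time domain.
    """
    coeffs_a = pywt.wavedec(ts_a, wavelet, level=level)
    coeffs_b = pywt.wavedec(ts_b, wavelet, level=level)
    return [float(np.corrcoef(ca, cb)[0, 1])
            for ca, cb in zip(coeffs_a, coeffs_b)]

# Toy usage: two subjects sharing a slow stimulus-locked component.
rng = np.random.default_rng(4)
t = np.arange(256)
shared = np.sin(2 * np.pi * t / 32)       # stand-in for the protocol
subj1 = shared + 0.8 * rng.standard_normal(t.size)
subj2 = shared + 0.8 * rng.standard_normal(t.size)
print(wavelet_correlation(subj1, subj2))  # coarse scales correlate most
```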
Abstract:
The increasing availability of mobility data, and the awareness of its importance and value, have motivated many researchers to develop models and tools for analyzing movement data. This paper presents a brief survey of significant research work on the modeling, processing and visualization of data about moving objects. We identify some key research fields that will provide better features for the online analysis of movement data. As a result of the literature review, we suggest a generic multi-layer architecture for the development of an online analysis processing software tool, which will be used to define the future work of our team.
Abstract:
3rd SMTDA Conference Proceedings, 11-14 June 2014, Lisbon, Portugal.