893 results for data analysis: algorithms and implementation


Relevance: 100.00%

Abstract:

This thesis builds a framework for evaluating downside risk from multivariate data via a special class of risk measures (RM). The analysis is distinctive in that it avoids strong distributional assumptions about the data and is oriented towards the data most critical in risk management: those with asymmetries and heavy tails. At the same time, under typical assumptions, such as ellipticity of the data probability distribution, conformity with classical methods is shown. The constructed class of RM is a multivariate generalization of the coherent distortion RM, which possesses valuable properties for a risk manager. The design of the framework is twofold. The first part contains new computational geometry methods for high-dimensional data. The developed algorithms demonstrate the computability of the geometric concepts used to construct the RM. These concepts aid visualization and simplify interpretation of the RM. The second part develops models for applying the framework to practical problems. The spectrum of applications ranges from robust portfolio selection to broader areas, such as stochastic conic optimization with risk constraints or supervised machine learning.
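To make the distortion-RM family concrete, a minimal univariate sketch in Python follows; the thesis's multivariate, geometry-based generalization is not reproduced here, and the discrete formula and the CVaR distortion are standard textbook choices, not taken from the thesis:

    import numpy as np

    def distortion_risk(losses, g):
        # Discrete distortion RM: rho = sum_k x_(k) * [g((n-k+1)/n) - g((n-k)/n)],
        # with x_(1) <= ... <= x_(n); a concave distortion g yields a coherent RM.
        x = np.sort(losses)
        n = len(x)
        k = np.arange(1, n + 1)
        w = g((n - k + 1) / n) - g((n - k) / n)   # weights on the order statistics
        return float(np.dot(w, x))

    # CVaR at level 0.95 arises from the concave distortion g(u) = min(u/0.05, 1)
    cvar_95 = lambda u: np.minimum(u / 0.05, 1.0)
    heavy_tailed = np.random.default_rng(0).standard_t(df=3, size=10_000)
    print(distortion_risk(heavy_tailed, cvar_95))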

Relevance: 100.00%

Abstract:

This paper compares the performance of the complex nonlinear least squares algorithm implemented in the LEVM/LEVMW software with the performance of a genetic algorithm in the characterization of an electrical impedance of known topology. The effect of the number of measured frequency points and of measurement uncertainty on the estimation of circuit parameters is presented. The analysis is performed on the equivalent circuit impedance of a humidity sensor.
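For readers unfamiliar with complex nonlinear least squares fitting of an impedance of known topology, a hedged Python sketch follows; the series-R plus parallel-RC circuit, parameter values, and noise level are illustrative assumptions, not the sensor model or weighting scheme used in the paper:

    import numpy as np
    from scipy.optimize import least_squares

    def z_model(params, w):
        # Hypothetical topology: series resistance Rs plus a parallel R-C branch
        rs, r, c = params
        return rs + r / (1 + 1j * w * r * c)

    def residuals(params, w, z_meas):
        diff = z_model(params, w) - z_meas
        return np.concatenate([diff.real, diff.imag])   # stack real and imaginary parts

    w = 2 * np.pi * np.logspace(1, 5, 40)               # measured frequency points
    true = (100.0, 10e3, 50e-9)
    rng = np.random.default_rng(1)
    z = z_model(true, w) * (1 + 0.01 * rng.standard_normal(w.size))  # 1% noise
    fit = least_squares(residuals, x0=(50, 5e3, 10e-9), args=(w, z))
    print(fit.x)   # estimated Rs, R, C

Reducing the number of frequency points in w, or raising the noise level, reproduces in miniature the two effects on parameter estimation that the paper studies.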

Relevance: 100.00%

Abstract:

Reinforcement Learning is an increasingly popular area of Artificial Intelligence. The applications of this learning paradigm are many, but its use in mobile computing is in its infancy. This study aims to provide an overview of current Reinforcement Learning applications on mobile devices, and to introduce a new framework for iOS devices: Swift-RL Lib. This new Swift package allows developers to easily support and integrate two of the most common RL algorithms, Q-Learning and Deep Q-Network, in a fully customizable environment. All processing is performed on the device, with no need for remote computation. The framework was tested in different settings and evaluated through several use cases. Through an in-depth performance analysis, we show that the platform provides effective and efficient support for Reinforcement Learning in mobile applications.
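As a language-agnostic illustration of the simpler of the two algorithms, a tabular Q-Learning update is sketched below in Python; it mirrors the textbook rule and is not the Swift-RL Lib API:

    import numpy as np

    n_states, n_actions = 16, 4
    Q = np.zeros((n_states, n_actions))
    alpha, gamma, eps = 0.1, 0.99, 0.1   # learning rate, discount, exploration
    rng = np.random.default_rng(0)

    def update(s, a, r, s_next):
        # Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))
        Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])

    def act(s):
        # epsilon-greedy action selection
        if rng.random() < eps:
            return int(rng.integers(n_actions))
        return int(Q[s].argmax())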

Relevance: 100.00%

Abstract:

In higher education, undergraduate teaching materials are increasingly available online. There is a need to understand the complex processes that occur during their production and how social networks between different groups affect their development. This paper draws on qualitative interviews and participants' drawings of their social networks to understand the dynamics of creating a new e-compendium for a four-year online undergraduate nursing programme in Norway. Twenty staff interviews were undertaken to explore views of the e-compendium, the development process and the networks perceived to have formed during its creation. Interview data were thematically analysed along with the network drawings. The findings showed three main institutional stakeholder groups: the 'management team', 'design team' and 'lecturers'. Analysis of the social networks revealed variability of relations both within and between groups. The pedagogical designer, who was part of the design team, was central to communicating with and co-ordinating staff at all levels. The least well connected were the lecturers; to them, the e-compendium challenged and even threatened previously well-established notions of pedagogy. Future development of e-compendiums should account for lecturers' perceived lack of time and existing workloads so that they can be involved in the development process.

Relevance: 100.00%

Abstract:

Hadrontherapy employs high-energy beams of charged particles (protons and heavier ions) to treat deep-seated tumours: these particles have a favourable depth-dose distribution in tissue, characterized by a low dose in the entrance channel and a sharp maximum (Bragg peak) near the end of their path. In these treatments nuclear interactions have to be considered: beam particles can fragment in the human body, releasing a non-zero dose beyond the Bragg peak, while fragments of human-body nuclei can modify the dose released in healthy tissues. These effects remain open questions given the lack of relevant cross-section data. Space radioprotection can also profit from fragmentation cross-section measurements: interest in long-term crewed space missions beyond Low Earth Orbit has grown in recent years, but such missions must cope with major health risks due to space radiation. To this end, risk models are under study; however, huge gaps in fragmentation cross-section data currently prevent an accurate benchmark of deterministic and Monte Carlo codes. To fill these gaps, the FOOT (FragmentatiOn Of Target) experiment was proposed. It is composed of two independent and complementary setups: an Emulsion Cloud Chamber and an electronic setup composed of several subdetectors providing redundant measurements of the kinematic properties of fragments produced in nuclear interactions between a beam and a target. FOOT aims to measure double-differential cross sections in both angle and kinetic energy, the most complete information available to address the existing questions. In this Ph.D. thesis, the development of the Trigger and Data Acquisition system for the FOOT electronic setup and a first analysis of 400 MeV/u 16O beam on carbon target data acquired in July 2021 at GSI (Darmstadt, Germany) are presented. Where possible, a comparison with other available measurements is also reported.
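For reference, the double-differential cross section FOOT targets is conventionally estimated from counts in an (angle, kinetic energy) bin; in standard notation (symbols as commonly defined, not quoted from the thesis):

    \frac{d^2\sigma}{d\Omega\, dE_k}(\theta, E_k) =
        \frac{N_{\mathrm{frag}}(\theta, E_k)}
             {N_{\mathrm{beam}}\; n_{\mathrm{tgt}}\; \Delta\Omega\; \Delta E_k\; \epsilon}

where N_frag is the number of fragments counted in the bin, N_beam the number of beam particles, n_tgt the areal density of target nuclei, Delta Omega and Delta E_k the bin widths, and epsilon the detection efficiency.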

Relevance: 100.00%

Abstract:

The thesis represents the conclusive outcome of the European Joint Doctorate programme in Law, Science & Technology funded by the European Commission through the Marie Skłodowska-Curie Innovative Training Networks actions within H2020, grant agreement n. 814177. The tension between data protection and privacy on one side, and the need to grant further uses of processed personal data on the other, is investigated, tracing the technological development of the de-anonymization/re-identification risk with an explorative survey. After assessing its extent, it is questioned whether a certain degree of anonymity can still be granted, focusing on a double perspective: an objective and a subjective one. The objective perspective focuses on the data processing models per se, while the subjective perspective investigates whether the distribution of roles and responsibilities among stakeholders can ensure data anonymity.

Relevance: 100.00%

Abstract:

The Probe for LUminosity MEasurement (PLUME) detector is a luminometer for the LHCb experiment at CERN. It will provide instantaneous luminosity measurements for LHCb during Run 3 of the LHC. The goal of this thesis is to evaluate, using simulated data, the expected performance of PLUME, such as the occupancy of the PMTs that make up the detector, and to report the analysis of the first data obtained by PLUME during a Van der Meer scan. In particular, three measurements of the cross-section value, needed to calibrate the detector, were obtained: σ1Da = (1.14 ± 0.11) mb, σ1Db = (1.13 ± 0.10) mb, σ2D = (1.20 ± 0.02) mb, where the subscripts 1D and 2D correspond to one-dimensional and two-dimensional Van der Meer scans respectively. All results are in agreement with one another.
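For context, in a Van der Meer scan the visible cross section is conventionally extracted from the scan-curve widths via (standard form, not quoted from the thesis):

    \sigma_{\mathrm{vis}} = \mu_{\mathrm{vis}}^{\mathrm{max}}\,
        \frac{2\pi\, \Sigma_x\, \Sigma_y}{n_1 n_2}

where mu_vis^max is the peak visible interaction rate per bunch crossing, Sigma_x and Sigma_y are the convolved beam widths measured in the x and y scans, and n_1, n_2 are the bunch populations.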

Relevance: 100.00%

Abstract:

The thesis is the result of work conducted during a period of six months at the Strategy department of Automobili Lamborghini S.p.A. in Sant'Agata Bolognese (BO) and concerns the study and analysis of Big Data relating to Lamborghini's connected cars. The Big Data initiative is a project of the Connected Car Project House, an inter-departmental team that works toward the definition of the Lamborghini corporate connectivity strategy and its implementation in the product portfolio. Connected-car data is one of the hottest topics in the automotive industry right now; all the largest automotive companies are investing heavily in this direction in order to derive the greatest advantages both from a purely economic point of view, because these data reveal much about the behaviors and habits of each driver, and from a technological point of view, because they will increasingly promote the development of 5G, an important enabler for the future of connectivity. The main purpose of the work, from Lamborghini's perspective, is to analyze the data of the connected cars, in particular a data set referring to connected Huracans already placed on the market, and from that starting point derive valuable Key Performance Indicators (KPIs) on which the company could partly base the decisions to be made in the near future. The key result obtained at the end of this period was the creation of a dashboard in which it is possible to visualize many parameters and indicators related both to driving habits and to the use of the vehicle itself, which brought great insight into the huge potential and value behind the study of these data. The final demo of the project received great interest, not only from the whole Strategy department but also from all the other business areas of Lamborghini, raising awareness that this will be the road to follow in the coming years.
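As a hedged sketch of how such KPIs might be derived, a few pandas aggregations over a hypothetical telematics table follow; the schema, column names, and values are invented for illustration and are not Lamborghini's actual data model:

    import pandas as pd

    # Hypothetical per-trip telematics records; columns are illustrative only
    trips = pd.DataFrame({
        "vin": ["A", "A", "B"],
        "distance_km": [120.5, 30.2, 250.0],
        "max_speed_kmh": [240, 110, 280],
        "drive_mode": ["SPORT", "STRADA", "CORSA"],
    })
    kpis = trips.groupby("vin").agg(
        total_km=("distance_km", "sum"),
        avg_trip_km=("distance_km", "mean"),
        top_speed=("max_speed_kmh", "max"),
    )
    mode_share = trips["drive_mode"].value_counts(normalize=True)  # usage mix per mode
    print(kpis, mode_share, sep="\n\n")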

Relevance: 100.00%

Abstract:

With the development of new technologies, Air Traffic Control in the vicinity of the airport has switched from purely visual control to the use of radar, sensors, and so on. As the industry moves toward so-called Industry 4.0, it becomes possible in this context too to implement new tools that can facilitate the work of Air Traffic Controllers. The European Union proposed an innovative project to help the digitalization of the European sky by means of the Single European Sky ATM Research (SESAR) programme, the foundation on which the Single European Sky (SES) is based, in order to improve existing technologies and transform Air Traffic Management in Europe. Within this frame, the Resilient Synthetic Vision for Advanced Control Tower Air Navigation Service Provision (RETINA) project, launched in 2016, studied the possibility of applying new tools within the conventional control tower to reduce air traffic controller workload, thanks to improvements in augmented reality technologies. After the validation of RETINA, the Digital Technologies for Tower (DTT) project was established, and the solution proposed by the University of Bologna aimed, among other things, to introduce Safety Nets in a Head-Up visualization. The aim of this thesis is to analyze the Safety Nets in use within the control tower and, by developing a working concept, implement them in a Head-Up view to be tested by Air Traffic Control Operators (ATCOs). The results of the technical test show that the concept works and could lead to future implementation in a real environment, as it improves air traffic controllers' working conditions even when low-visibility conditions apply.

Relevance: 100.00%

Abstract:

Industrial companies, particularly those with induction motors and gearboxes as integral components of their systems, are utilizing Condition Monitoring (CM) systems more and more frequently in order to discover the need for maintenance in advance, whereas traditional maintenance only performs tasks once a failure has been identified. Utilizing a CM system is essential to boost productivity and minimize long-term failures that result in financial loss. The more exact and practical the CM system, the better the data analysis, which contributes to a more precise maintenance forecast. This thesis project is a cooperation with PEI Vibration Monitoring s.r.l. to design and construct a low-cost vibrational condition monitoring system that checks the health of induction motors and gearboxes automatically. According to the company's request, such a system should have specifications comparable to the NI 9234, one of the company's standard Data Acquisition (DAQ) boards, but at a significantly lower price. PEI Vibration Monitoring supplied all hardware and electronic components. The proposed CM system is capable of high-precision autonomous monitoring of induction motors and gearboxes, and it consists of a Raspberry Pi 3B and an MCC 172 DAQ board.
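A minimal sketch of the kind of vibration analysis such a system performs is given below in Python, using a synthetic accelerometer signal; in deployment the samples would come from the MCC 172 via its vendor library, and the sample rate and component frequencies here are assumptions:

    import numpy as np
    from scipy.signal import welch

    fs = 25_600                      # assumed sample rate in Hz
    t = np.arange(0, 1.0, 1 / fs)
    # Synthetic accelerometer signal standing in for a real acquisition:
    # a shaft-speed tone, a gear-mesh tone, and broadband noise
    sig = np.sin(2 * np.pi * 29.5 * t) + 0.3 * np.sin(2 * np.pi * 590 * t)
    sig += 0.05 * np.random.default_rng(2).standard_normal(t.size)

    rms = np.sqrt(np.mean(sig ** 2))            # overall vibration severity
    f, psd = welch(sig, fs=fs, nperseg=4096)    # spectrum for fault-frequency checks
    print(rms, f[np.argmax(psd)])               # dominant frequency, e.g. shaft speed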

Relevance: 100.00%

Abstract:

There are many natural events that can negatively affect the urban ecosystem, but weather-climate variations are certainly among the most significant. The history of settlements has been characterized by extreme events like earthquakes and floods, which recur at different times and cause extensive damage to the built heritage on a structural and urban scale. Changes in climate also alter various climatic subsystems, changing rainfall regimes and hydrological cycles and increasing the frequency and intensity of extreme precipitation events (heavy rainfall). From a hydrological-risk perspective, it is crucial to understand the future events that could occur, and their magnitude, in order to design safer infrastructures. Unfortunately, future scenarios are not easy to anticipate, as the complexity of the climate is enormous. For this thesis, precipitation and discharge extremes were primarily used as data sources. It is important to underline that the two data sets are not independent: changes in rainfall regime due to climate change could significantly affect overflows into receiving water bodies. It is imperative to understand and model climate-change effects on water structures to support the development of adaptation strategies. The main purpose of this thesis is to identify suitable water structures for a road located along the Tione River; through a hydrological analysis of the area, we aim to guarantee the safety of the infrastructure over time. The observations made underline how models such as stochastic ones can improve the quality of an analysis for design purposes and influence design choices.
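As an illustration of the extreme-value side of such an analysis, a hedged Python sketch follows: a GEV distribution is fitted to a hypothetical annual-maximum discharge series and design return levels are derived (the data and the choice of GEV are assumptions, not the thesis's actual model):

    import numpy as np
    from scipy.stats import genextreme

    # Hypothetical annual-maximum discharge series (m^3/s) at the design section
    annual_max = np.array([42.0, 55.3, 61.2, 38.9, 70.1, 48.4, 85.6,
                           52.0, 66.3, 44.8, 90.2, 58.7, 63.4, 49.9])
    shape, loc, scale = genextreme.fit(annual_max)
    # T-year return level: the discharge exceeded with probability 1/T in any year
    for T in (50, 100, 200):
        print(T, genextreme.isf(1 / T, shape, loc, scale))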

Relevance: 100.00%

Abstract:

Information on fruit and vegetable consumption in Brazil at the three levels of dietary data was analyzed and compared. Data on national supply came from the Food Balance Sheets compiled by the FAO; household availability information was obtained from the Brazilian National Household Budget Survey (HBS); and actual intake information came from a large individual dietary intake survey representative of the adult population of São Paulo city. All sources of information were collected between 2002 and 2003. A subset of the HBS representative of São Paulo city was used in our analysis in order to improve the quality of the comparison with actual intake data. The ratio of national supply to household availability of fruits and vegetables was 2.6, while the ratio of national supply to actual intake was 4.0. The discrepancy ratio between household availability and actual intake was smaller, 1.6. While the use of supply and availability data has advantages, such as lower cost, it must be taken into account that these sources tend to overestimate the actual intake of fruits and vegetables.

Relevance: 100.00%

Abstract:

A combination of deductive reasoning, clustering, and inductive learning is given as an example of a hybrid system for exploratory data analysis. Visualization is replaced by a dialogue with the data.
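One minimal sketch of such a hybrid, assuming scikit-learn and standard components rather than the system described in the abstract: clusters are found unsupervised, then a decision tree inductively learns symbolic rules that describe them, replacing a plot with a textual account of the data:

    from sklearn.cluster import KMeans
    from sklearn.datasets import load_iris
    from sklearn.tree import DecisionTreeClassifier, export_text

    X = load_iris().data
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)  # clustering
    tree = DecisionTreeClassifier(max_depth=2).fit(X, labels)                # inductive learning
    # The printed rules "explain" the clusters symbolically: a textual
    # dialogue with the data in place of a visualization
    print(export_text(tree, feature_names=load_iris().feature_names))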

Relevance: 100.00%

Abstract:

The performance of three analytical methods for multiple-frequency bioelectrical impedance analysis (MFBIA) data was assessed. The methods were the established method of Cole and Cole, the newly proposed method of Siconolfi and co-workers and a modification of this procedure. Method performance was assessed from the adequacy of the curve fitting techniques, as judged by the correlation coefficient and standard error of the estimate, and the accuracy of the different methods in determining the theoretical values of impedance parameters describing a set of model electrical circuits. The experimental data were well fitted by all curve-fitting procedures (r = 0.9 with SEE 0.3 to 3.5% or better for most circuit-procedure combinations). Cole-Cole modelling provided the most accurate estimates of circuit impedance values, generally within 1-2% of the theoretical values, followed by the Siconolfi procedure using a sixth-order polynomial regression (1-6% variation). None of the methods, however, accurately estimated circuit parameters when the measured impedances were low (<20 Ω), reflecting the electronic limits of the impedance meter used. These data suggest that Cole-Cole modelling remains the preferred method for the analysis of MFBIA data.
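For illustration, Cole-Cole modelling amounts to fitting Z(w) = R_inf + (R0 - R_inf) / (1 + (j w tau)^alpha) to the measured complex impedances; a minimal Python sketch follows (parameter values and starting guesses are illustrative, and this is not the LEVM or Siconolfi implementation):

    import numpy as np
    from scipy.optimize import least_squares

    def cole_cole(params, w):
        # Z(w) = R_inf + (R0 - R_inf) / (1 + (j*w*tau)**alpha)
        r0, rinf, tau, alpha = params
        return rinf + (r0 - rinf) / (1 + (1j * w * tau) ** alpha)

    def residuals(params, w, z):
        d = cole_cole(params, w) - z
        return np.concatenate([d.real, d.imag])

    w = 2 * np.pi * np.logspace(3, 6, 30)          # multiple-frequency sweep
    true = (700.0, 400.0, 2e-6, 0.85)
    z = cole_cole(true, w)                         # noise-free synthetic spectrum
    fit = least_squares(residuals, x0=(600, 300, 1e-6, 0.7), args=(w, z),
                        bounds=([0, 0, 0, 0], [np.inf, np.inf, np.inf, 1]))
    print(fit.x)   # estimated R0, R_inf, tau, alpha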

Relevance: 100.00%

Abstract:

The identification, modeling, and analysis of interactions between nodes of neural systems in the human brain have become the aim of interest of many studies in neuroscience. The complex neural network structure and its correlations with brain functions have played a role in all areas of neuroscience, including the comprehension of cognitive and emotional processing. Indeed, understanding how information is stored, retrieved, processed, and transmitted is one of the ultimate challenges in brain research. In this context, in functional neuroimaging, connectivity analysis is a major tool for the exploration and characterization of the information flow between specialized brain regions. In most functional magnetic resonance imaging (fMRI) studies, connectivity analysis is carried out by first selecting regions of interest (ROI) and then calculating an average BOLD time series (across the voxels in each cluster). Some studies have shown that the average may not be a good choice and have suggested, as an alternative, the use of principal component analysis (PCA) to extract the principal eigen-time series from the ROI(s). In this paper, we introduce a novel approach called cluster Granger analysis (CGA) to study connectivity between ROIs. The main aim of this method was to employ multiple eigen-time series in each ROI to avoid temporal information loss during identification of Granger causality. Such information loss is inherent in averaging (e.g., to yield a single "representative" time series per ROI). This, in turn, may lead to a lack of power in detecting connections. The proposed approach is based on multivariate statistical analysis and integrates PCA and partial canonical correlation in a framework of Granger causality for clusters (sets) of time series. We also describe an algorithm for statistical significance testing based on bootstrapping. By using Monte Carlo simulations, we show that the proposed approach outperforms conventional Granger causality analysis (i.e., using representative time series extracted by signal averaging or first principal components estimation from ROIs). The usefulness of the CGA approach in real fMRI data is illustrated in an experiment using human faces expressing emotions. With this data set, the proposed approach suggested the presence of significantly more connections between the ROIs than were detected using a single representative time series in each ROI. (c) 2010 Elsevier Inc. All rights reserved.
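For contrast with CGA, the conventional single-component baseline the paper improves on can be sketched in a few lines of Python: one principal eigen-time series per ROI, then a pairwise Granger test (synthetic data; CGA's use of multiple eigen-time series with partial canonical correlation is not reproduced here):

    import numpy as np
    from sklearn.decomposition import PCA
    from statsmodels.tsa.stattools import grangercausalitytests

    rng = np.random.default_rng(3)
    # Hypothetical ROI data: time points x voxels, with a shared signal in ROI A
    base = rng.standard_normal((200, 1))
    roi_a = base + 0.2 * rng.standard_normal((200, 50))
    roi_b = rng.standard_normal((200, 50))
    roi_b[1:] += 0.5 * base[:-1]                   # ROI A drives ROI B at lag 1

    pc_a = PCA(n_components=1).fit_transform(roi_a)[:, 0]
    pc_b = PCA(n_components=1).fit_transform(roi_b)[:, 0]
    # Tests whether the second column (pc_a) Granger-causes the first (pc_b);
    # the output reports F-test p-values at each lag
    res = grangercausalitytests(np.column_stack([pc_b, pc_a]), maxlag=2)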