699 results for data warehouse tuning aggregato business intelligence performance


Relevance:

100.00%

Publisher:

Abstract:

Graduate Program in Cartographic Sciences - FCT

Relevance:

100.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

100.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

100.00%

Publisher:

Abstract:

The complexity of biological samples poses a major challenge for reliable compound identification in mass spectrometry (MS). The presence of interfering compounds that cause additional peaks in the spectrum can make interpretation and assignment difficult. To overcome this issue, new approaches are needed to reduce complexity and simplify spectral interpretation. Recently, focusing on unknown metabolite identification, we presented a new approach, RANSY (ratio analysis of nuclear magnetic resonance spectroscopy; Anal. Chem. 2011, 83, 7616-7623), which extracts the signals related to the same metabolite based on peak intensity ratios. On the basis of this concept, we present the ratio analysis of mass spectrometry (RAMSY) method, which facilitates improved compound identification in complex MS spectra. RAMSY works on the principle that, under a given set of experimental conditions, the abundance/intensity ratios between the mass fragments from the same metabolite are relatively constant. Therefore, the quotients of average peak ratios and their standard deviations, generated using a small set of MS spectra from the same ion chromatogram, efficiently allow the statistical recovery of the metabolite peaks and facilitate reliable identification. RAMSY was applied to both gas chromatography/MS and liquid chromatography/tandem MS (LC-MS/MS) data to demonstrate its utility. RAMSY typically performs better than correlation-based methods. RAMSY promises to improve unknown metabolite identification for MS users in metabolomics or other fields.
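
As a rough illustration of the ratio principle described here, the sketch below scores each fragment by the quotient of its average intensity ratio to a reference fragment and the standard deviation of that ratio across a few scans of the same ion chromatogram. It is a simplified reading of the idea, not the authors' implementation; the helper name `ramsy_scores` and the toy data are invented.

```python
import numpy as np

def ramsy_scores(intensities: np.ndarray, ref_fragment: int) -> np.ndarray:
    """Score fragments by mean(ratio) / std(ratio) against a reference fragment.

    `intensities` holds one row per consecutive MS scan across a single
    ion-chromatographic peak and one column per m/z fragment. Fragments that
    belong to the same metabolite as the reference keep a nearly constant
    ratio across scans, so their score is high.
    """
    ref = intensities[:, ref_fragment]
    ratios = intensities / ref[:, None]                   # fragment-to-reference ratio, scan by scan
    mean = ratios.mean(axis=0)
    std = np.maximum(ratios.std(axis=0, ddof=1), 1e-12)   # guard against exactly constant ratios
    scores = mean / std                                   # quotient of average ratio and its std. dev.
    scores[ref_fragment] = 0.0                            # the reference's ratio to itself is trivial
    return scores

# Toy example: fragments 0-2 co-vary (same metabolite), fragment 3 is an interferent.
rng = np.random.default_rng(0)
profile = np.array([1.0, 0.5, 0.2])                        # fixed fragment ratios of one metabolite
abundance = rng.uniform(0.5, 2.0, size=(8, 1))             # varying overall abundance across scans
scans = abundance * profile * rng.normal(1.0, 0.02, size=(8, 3))
interferent = rng.uniform(0.1, 1.0, size=(8, 1))
spectra = np.hstack([scans, interferent])
print(ramsy_scores(spectra, ref_fragment=0))               # fragments 1 and 2 score far above fragment 3
```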

Relevance:

100.00%

Publisher:

Abstract:

The Box-Cox transformation is a technique mostly used to make the probability distribution of time series data approximately normal, which helps statistical and neural models produce more accurate forecasts. However, it introduces a bias when the transformation is reversed on the predicted data. The statistical methods for a bias-free reversion necessarily require the assumption that the transformed data distribution is Gaussian, which is rarely the case for real-world time series. The aim of this study was therefore to provide an effective method for removing the bias when the Box-Cox transformation is reversed. The developed method is based on a focused time-lagged feedforward neural network, which does not require any assumption about the transformed data distribution. To evaluate the performance of the proposed method, numerical simulations were conducted and the Mean Absolute Percentage Error, the Theil Inequality Index and the Signal-to-Noise Ratio of 20-step-ahead forecasts of 40 time series were compared; the results indicate that the proposed reversion method is valid and justifies further studies.
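
The bias discussed here is easy to reproduce numerically. The sketch below only illustrates the problem, not the proposed neural-network reversion: it back-transforms a point forecast naively with the inverse Box-Cox and contrasts it with the classical Gaussian bias correction, i.e. exactly the kind of assumption-bound fix the study aims to replace. The data and variable names are invented.

```python
import numpy as np
from scipy.special import inv_boxcox
from scipy.stats import boxcox

rng = np.random.default_rng(42)
x = rng.lognormal(mean=1.0, sigma=0.6, size=10_000)    # skewed "time series" levels

z, lam = boxcox(x)                                     # transform toward normality
z_hat, z_var = z.mean(), z.var(ddof=1)                 # stand-in for a point forecast and its error variance

naive = inv_boxcox(z_hat, lam)                         # plain back-transform: pulled toward the median
# Classical second-order correction, valid only if the transformed data are ~Gaussian.
corrected = naive * (1.0 + z_var * (1.0 - lam) / (2.0 * (lam * z_hat + 1.0) ** 2))

print(f"true mean : {x.mean():.3f}")
print(f"naive     : {naive:.3f}")      # noticeably low
print(f"corrected : {corrected:.3f}")  # closer, but relies on the Gaussian assumption
```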

Relevance:

100.00%

Publisher:

Abstract:

Graduate Program in Information Science - FFC

Relevance:

100.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

100.00%

Publisher:

Abstract:

The wide variety of molecular architectures used in sensors and biosensors and the large amount of data generated with some principles of detection have motivated the use of computational methods, such as information visualization techniques, not only to handle the data but also to optimize sensing performance. In this study, we combine projection techniques with micro-Raman scattering and atomic force microscopy (AFM) to address critical issues related to practical applications of electronic tongues (e-tongues) based on impedance spectroscopy. Experimentally, we used sensing units made with thin films of a perylene derivative (AzoPTCD), coating Pt interdigitated electrodes, to detect CuCl2 (Cu2+), methylene blue (MB), and saccharose in aqueous solutions, which were selected due to their distinct molecular sizes and ionic character in solution. The AzoPTCD films were deposited from monolayers to 120 nm via Langmuir-Blodgett (LB) and physical vapor deposition (PVD) techniques. Because the main aspects investigated were how the interdigitated electrodes are coated by thin films (architecture on the e-tongue) and the film thickness, we decided to employ the same material for all sensing units. The capacitance data were projected into a 2D plot using the force scheme method, from which we could infer that at low analyte concentrations the electrical response of the units was determined by the film thickness. Concentrations of 10 µM or higher could be distinguished with thinner films (tens of nanometers at most), which could withstand the impedance measurements without causing significant changes in the Raman signal of the AzoPTCD film-forming molecules. The sensitivity to the analytes appears to be related to adsorption on the film surface, as inferred from Raman spectroscopy data using MB as analyte and from the multidimensional projections. The analysis of the results presented may serve as a new route to select materials and molecular architectures for novel sensors and biosensors, in addition to suggesting ways to unravel the mechanisms behind the high sensitivity obtained in various sensors.
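
For readers unfamiliar with the force scheme projection mentioned above, the sketch below is a generic re-implementation of the idea: 2D positions are iteratively adjusted so that their pairwise distances approach the (normalized) distances in the original capacitance feature space. The function name `force_scheme` and the `capacitance_matrix` usage are illustrative, not the authors' code.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform

def force_scheme(data: np.ndarray, iters: int = 100, step: float = 0.1,
                 seed: int = 0) -> np.ndarray:
    """Project `data` (n_samples x n_features) to 2D by pairwise force relaxation."""
    rng = np.random.default_rng(seed)
    n = data.shape[0]
    d_high = squareform(pdist(data))               # target distances in feature space
    d_high /= d_high.max()                         # normalize for comparable magnitudes
    proj = rng.normal(size=(n, 2))                 # random initial 2D layout
    for _ in range(iters):
        for i in range(n):
            delta = proj - proj[i]                 # vectors from point i to every point
            d_low = np.linalg.norm(delta, axis=1)
            d_low[i] = 1.0                         # avoid division by zero for point i itself
            force = (d_high[i] - d_low) / d_low    # positive: push apart, negative: pull together
            force[i] = 0.0
            proj += step * force[:, None] * delta  # move the other points along the connecting vectors
    return proj

# Hypothetical usage: rows = sensing units/concentrations, columns = capacitance vs. frequency.
# coords = force_scheme(capacitance_matrix)
```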

Relevance:

100.00%

Publisher:

Abstract:

The objective of this work is to parameterize, implement the data structures, and program the applications needed to enable the exchange of information between two software environments, SAP R/3 and Knapp, each a leader in its field. Applying these changes will allow the organization not only to centralize information in the ERP, but also to improve its business processes and speed up decision-making by the people responsible. A study of the current situation is carried out and, after a detailed analysis, a solution is proposed to achieve the stated objectives. Once the proposal has been designed, presented, and approved, SAP R/3 is parameterized, the IDOC segments and types are defined, and the functions and programs needed to process the information sent by Knapp are coded. When these tasks are finished, data sets for the commercial processes are prepared and executed in a test environment, in collaboration with the key users, to verify the soundness of the implemented solution. The results are analyzed and any deficiencies are corrected. Finally, all the changes are transported to the production system and the correct execution of the organization's business processes is verified.

Relevance:

100.00%

Publisher:

Abstract:

This Master's thesis examines, starting from an analysis of the current legislation and of internal company procedures, the Integrated Quality, Safety and Environment Management System (SGI QSA) of HERA SpA, with particular attention to Prevention and Protection in the workplace with reference to the Consolidated Safety Act (D.Lgs 81/2008). Specifically, the work is based on the experience gained during a five-month internship at the "Servizio Prevenzione e Protezione" office of the Struttura Operativa Territoriale (SOT) Bologna. During my time at HERA SpA, I was able to observe and take part in the activities carried out daily, both in the office and at the plants located throughout the province of Bologna, with particular regard to the collection, management, and usability of data concerning workplace safety. During the internship I also had the highly formative opportunity to observe the processes, technologies, and operating methods underlying the delivery of services by a multi-utility, and to gain awareness and know-how about the resources deployed to carry out activities such as waste collection and disposal, as well as supplying water and gas to customers. I believe this gives me added value both professionally and personally. The primary aim of this work is to take a snapshot of a complex and rapidly evolving company such as HERA, starting from workers' health and safety, with the objective of describing the activities carried out during the internship and the contribution made to the development and maintenance of the SGS (Health and Safety Management System). To better distinguish the different nature of the information reported, the work is divided into two main parts. Part I covers the study of the legislation governing the sector, with particular reference to the TUSL, the Consolidated Act on Workplace Safety (the law in force in Italy), and to the British standard OHSAS 18001, which organizations wishing to certify their safety management system may adopt. The ISO 9001 and ISO 14001 standards, which concern certifying a management system for service quality and environmental protection respectively, are then analyzed. Finally, some considerations are offered on the need to develop an integrated, certified management system that provides a unified view of Quality, Safety, and Environment. Part II addresses the activities carried out by the Prevenzione e Protezione office: starting from the company procedures that act as the point of contact between legal obligations and the need to regulate workers' operations, the tasks assigned to me and the activities performed during the internship are described.

Relevance:

100.00%

Publisher:

Abstract:

Today more than ever it is essential to be able to extract relevant information and knowledge from the large amounts of data that can reach us from a variety of contexts, such as databases connected to satellites and automatic sensors, user-generated repositories, and the data warehouses of large companies. One of the current challenges concerns the development of data mining techniques for handling uncertainty. The goal of this thesis is to extend current uncertainty-handling techniques, in particular those concerning classification with decision trees, so that uncertainty can also be handled on the class attribute.
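
A minimal sketch of what uncertainty on the class attribute can look like in decision-tree induction is given below: each training example carries a probability distribution over classes, and split quality is computed from expected class counts. This is a generic illustration under those assumptions, not the algorithm developed in the thesis; all names are invented.

```python
import numpy as np

def soft_entropy(class_probs: np.ndarray) -> float:
    """Entropy of the expected class distribution of a set of uncertain examples."""
    totals = class_probs.sum(axis=0)               # expected count per class
    p = totals / totals.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def information_gain(feature: np.ndarray, threshold: float,
                     class_probs: np.ndarray) -> float:
    """Gain of splitting on `feature <= threshold` with per-example class distributions."""
    left = feature <= threshold
    n = len(feature)
    gain = soft_entropy(class_probs)
    for mask in (left, ~left):
        if mask.any():
            gain -= mask.sum() / n * soft_entropy(class_probs[mask])
    return gain

# Toy data: 4 examples, 2 classes; each row sums to 1 and encodes label uncertainty.
X = np.array([1.0, 2.0, 8.0, 9.0])
Y = np.array([[0.9, 0.1], [0.8, 0.2], [0.2, 0.8], [0.1, 0.9]])
print(information_gain(X, 5.0, Y))                 # positive gain: the split separates the soft labels
```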

Relevance:

100.00%

Publisher:

Abstract:

The goal of this thesis is to deepen understanding of the features developed in the SCADA/EMS systems available on the market, in order to know the capabilities they offer: all the knowledge acquired is used to design a flexible, interactive data-analysis tool with which analyses can be performed that are not feasible with the other solutions examined. The design of the data-analysis tool is oriented toward defining a multidimensional model for representing the information: the design path requires identifying the information of interest to the user, so that it can be reintroduced when designing the new database. The final infrastructure of this new feature takes the form of a data warehouse: all the analysis information is stored on a database separate from that of On.Energy, avoiding coupling the performance of the two subsystems. Using a data warehouse lays the groundwork for analyses over long time periods: every type of data query involves an enormous amount of information, exactly in line with the characteristics of OLAP queries.
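
The kind of long-period, multidimensional query such a data warehouse is meant to serve can be sketched as follows; the fact table and its columns (plant, signal, ts, value) are hypothetical stand-ins for the On.Energy measurement data, not the actual schema designed in the thesis.

```python
import pandas as pd

# Fact table: one row per measurement, with dimension attributes and a measure.
fact = pd.DataFrame({
    "plant":  ["A", "A", "B", "B", "A", "B"],
    "signal": ["power", "power", "power", "voltage", "voltage", "power"],
    "ts": pd.to_datetime(["2014-01-01 00:00", "2014-01-01 01:00",
                          "2014-01-01 00:00", "2014-01-01 00:00",
                          "2014-02-01 00:00", "2014-02-01 00:00"]),
    "value": [10.0, 12.0, 7.0, 230.0, 229.0, 8.0],
})

# OLAP-style roll-up along the time dimension (month), sliced by the other
# dimensions, i.e. the long-period aggregation a data warehouse is designed for.
cube = (fact
        .assign(month=fact["ts"].dt.to_period("M"))
        .groupby(["plant", "signal", "month"])["value"]
        .agg(["mean", "min", "max", "count"]))
print(cube)
```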

Relevance:

100.00%

Publisher:

Abstract:

In multivariate time series analysis, the equal-time cross-correlation is a classic and computationally efficient measure for quantifying linear interrelations between data channels. When the cross-correlation coefficient is estimated using a finite amount of data points, its non-random part may be strongly contaminated by a sizable random contribution, such that no reliable conclusion can be drawn about genuine mutual interdependencies. The random correlations are determined by the signals' frequency content and the amount of data points used. Here, we introduce adjusted correlation matrices that can be employed to disentangle random from non-random contributions to each matrix element independently of the signal frequencies. Extending our previous work, these matrices allow analyzing spatial patterns of genuine cross-correlation in multivariate data regardless of confounding influences. The performance is illustrated using model systems with known interdependence patterns. Finally, we apply the methods to electroencephalographic (EEG) data with epileptic seizure activity.
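
A simplified sketch of the underlying idea follows. Instead of the frequency-based adjustment derived in the paper, it estimates the chance level of each correlation-matrix element from circularly time-shifted surrogates, a common and simpler stand-in that preserves each channel's spectrum while destroying genuine interrelations, and then zeroes entries that do not exceed that level.

```python
import numpy as np

def surrogate_level(data: np.ndarray, n_surr: int = 50, q: float = 95.0,
                    seed: int = 0) -> np.ndarray:
    """Per-element magnitude of correlation expected by chance for finite data."""
    rng = np.random.default_rng(seed)
    n_ch, n_samp = data.shape
    surr = np.empty((n_surr, n_ch, n_ch))
    for k in range(n_surr):
        shifted = np.vstack([np.roll(ch, rng.integers(1, n_samp)) for ch in data])
        surr[k] = np.abs(np.corrcoef(shifted))     # correlations with interrelations destroyed
    level = np.percentile(surr, q, axis=0)
    np.fill_diagonal(level, 0.0)
    return level

# Example: two genuinely coupled channels plus one independent channel.
rng = np.random.default_rng(1)
common = rng.standard_normal(500)
data = np.vstack([common + 0.5 * rng.standard_normal(500),
                  common + 0.5 * rng.standard_normal(500),
                  rng.standard_normal(500)])
c = np.corrcoef(data)                              # equal-time cross-correlation matrix
r = surrogate_level(data)
print(np.round(np.where(np.abs(c) > r, c, 0.0), 2))  # entries at or below chance are zeroed
```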

Relevance:

100.00%

Publisher:

Abstract:

Background mortality is an essential component of any forest growth and yield model. Forecasts of mortality contribute largely to the variability and accuracy of model predictions at the tree, stand and forest level. In the present study, I implement and evaluate state-of-the-art techniques to increase the accuracy of individual tree mortality models, similar to those used in many of the current variants of the Forest Vegetation Simulator, using data from North Idaho and Montana. The first technique addresses methods to correct for bias induced by measurement error typically present in competition variables. The second implements survival regression and evaluates its performance against the traditional logistic regression approach. I selected the regression calibration (RC) algorithm as a good candidate for addressing the measurement error problem. Two logistic regression models were fitted for each species: one ignoring the measurement error (the "naïve" approach) and the other applying RC. The models fitted with RC outperformed the naïve models in terms of discrimination when the competition variable was found to be statistically significant. The effect of RC was more obvious where the measurement error variance was large and for more shade-intolerant species. The process of model fitting and variable selection revealed that the past emphasis on DBH as a predictor variable for mortality, while producing models with strong metrics of fit, may make the models less generalizable. The evaluation, under different spatial patterns and diameter distributions, of the error variance estimator developed by Stage and Wykoff (1998), which is core to the implementation of RC, revealed that the Stage and Wykoff estimate notably overestimated the true variance in all simulated stands except those that were clustered. The results show a systematic bias even when all the assumptions made by the authors are met. I argue that this is the result of the Poisson-based estimate ignoring the overlapping area of potential plots around a tree. The effects of this variance estimate, especially in the application phase, justify future efforts to improve its accuracy. The second technique implemented and evaluated is a survival regression model that accounts for the time-dependent nature of variables, such as diameter and competition variables, and for the interval-censored nature of data collected from remeasured plots. The performance of the model is compared with the traditional logistic regression model as a tool to predict individual tree mortality. Validation of both approaches shows that the survival regression approach discriminates better between dead and alive trees for all species. In conclusion, I showed that the proposed techniques do increase the accuracy of individual tree mortality models and are a promising first step towards the next generation of background mortality models. I have also identified the next steps to undertake in order to advance mortality models further.
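
The regression calibration step can be sketched as follows for a single error-prone competition variable under a simple additive, homoscedastic error model with a known error variance; this illustrates RC in general, not the study's implementation with the Stage and Wykoff (1998) variance estimator, and all names and numbers are invented.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
n = 5_000
x_true = rng.normal(0.0, 1.0, n)                        # true competition index
sigma_u2 = 0.5                                          # known measurement-error variance
w = x_true + rng.normal(0.0, np.sqrt(sigma_u2), n)      # error-prone measurement
p = 1.0 / (1.0 + np.exp(-(-2.0 + 1.2 * x_true)))        # true mortality probability
died = rng.binomial(1, p)

# Naive fit: use W directly (the slope is attenuated toward zero).
naive = LogisticRegression(C=1e6).fit(w.reshape(-1, 1), died)

# Regression calibration: replace W by E[X | W] before fitting.
reliability = (w.var(ddof=1) - sigma_u2) / w.var(ddof=1)
x_rc = w.mean() + reliability * (w - w.mean())
rc = LogisticRegression(C=1e6).fit(x_rc.reshape(-1, 1), died)

print("true slope ~ 1.2")
print("naive slope:", naive.coef_[0][0])                # biased low
print("RC slope   :", rc.coef_[0][0])                   # close to the truth
```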

Relevance:

100.00%

Publisher:

Abstract:

In a statistical inference scenario, a target signal or its parameters are estimated by processing data from informative measurements. Estimation performance can be enhanced if the measurements are chosen according to criteria that direct the sensing resources so that the measurements are more informative about the parameter to be estimated. When taking multiple measurements, they can be chosen online so that more information is extracted from the data in each measurement process. This approach fits well within the Bayesian inference model, which is often used to produce successive posterior distributions of the associated parameter. We explore the sensor array processing scenario for adaptive sensing of a target parameter. The measurement choice is described by a measurement matrix that multiplies the data vector normally associated with array signal processing. Adaptive sensing of both static and dynamic system models is performed by online selection of the proper measurement matrix over time. For the dynamic system model, the target is assumed to move according to some distribution, and the prior distribution is changed at each time step. The information gained through adaptive sensing of the moving target is lost due to the relative shift of the target. The adaptive sensing paradigm has many similarities with compressive sensing. We have attempted to reconcile the two approaches by modifying the observation model of adaptive sensing to match the compressive sensing model for the estimation of a sparse vector.
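
A minimal sketch of such an adaptive measurement loop for a static model is given below: with a Gaussian prior and linear scalar measurements, the next measurement vector is taken along the direction of largest remaining posterior variance and the posterior is updated in closed form. The Gaussian linear model and this particular selection rule are illustrative assumptions, not the exact formulation studied here.

```python
import numpy as np

rng = np.random.default_rng(3)
dim, noise_var = 4, 0.1
x_true = rng.normal(size=dim)                         # unknown target parameter

mean = np.zeros(dim)                                  # Gaussian prior N(0, I)
cov = np.eye(dim)

for step in range(12):
    # Choose the next measurement vector adaptively from the current posterior.
    _, eigvecs = np.linalg.eigh(cov)
    a = eigvecs[:, -1]                                # most uncertain direction
    y = a @ x_true + rng.normal(scale=np.sqrt(noise_var))

    # Standard Gaussian (Kalman-style) posterior update for one scalar measurement.
    s = a @ cov @ a + noise_var
    k = cov @ a / s
    mean = mean + k * (y - a @ mean)
    cov = cov - np.outer(k, a @ cov)

print("estimation error:", np.linalg.norm(mean - x_true))
```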