7 results for Web log analysis
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
The thesis presents the new business opportunities offered by the Web. The revolutionary change that the pervasiveness of the Net and all related activities are bringing about has confronted companies with a different way of relating both to their consumers, who are increasingly informed, aware, and demanding, and to their competitors. The challenge to be met in order to remain competitive in the market is significant, and the transformation is developing rapidly: the features that distinguish this new digital paradigm are speed and mutability, but at the same time measurability, assessability, and predictability. Thanks to the technological tools available and to the dynamics of the various web spaces (sites, social networks, blogs, forums), it is easier than in the past to track the impact of initiatives, product launches, promotions, and advertising, measuring their return on investment as well as the end user's perception. A data-centric approach to marketing, based on analyses that monitor the network, thus allows a brand to make more targeted and considered investments grounded in estimates and forecasts. Among the most significant digital marketing strategies discussed are social advertising, keyword advertising, digital PR, social media, email marketing, and many others. Two case histories are also reported. The first is an excellent example of co-creation, in which the brand involved the public directly in the production process, entrusting the fans of its official Facebook Page with the choice of the yogurt flavours to be put on sale. The second, an international lead-generation case, allowed the brand to measure the conversion of website visitors (after they filled in a pop-in form) into actual buyers by linking the site's traffic data to sales data: an example of how closely online and offline communicate.
Abstract:
Intelligent systems are now pervasive in society, supporting a synergistic human-machine collaboration. Beyond economic and climate factors, energy consumption is strongly affected by the performance of computing systems, and poor software behavior may invalidate any improvement attempt. In addition, data-driven machine learning algorithms are the basis of human-centered applications, and their interpretability is one of the most important features of computational systems. Software maintenance is a critical discipline for supporting automatic, life-long system operation. As most software registers its internal events by means of logs, log analysis is an approach to keeping systems operational. Logs are Big Data assembled in large-flow streams, and they are unstructured, heterogeneous, imprecise, and uncertain. This thesis addresses fuzzy and neuro-granular methods that provide maintenance solutions for anomaly detection (AD) and log parsing (LP), dealing with data uncertainty and identifying ideal time periods for detailed software analyses; LP provides a deeper semantic interpretation of the anomalous occurrences. The solutions evolve over time and are general-purpose, being highly applicable, scalable, and maintainable. Granular classification models, namely the Fuzzy set-Based evolving Model (FBeM), the evolving Granular Neural Network (eGNN), and the evolving Gaussian Fuzzy Classifier (eGFC), are compared on the AD problem. The evolving Log Parsing (eLP) method is proposed for the automatic parsing of system logs. All the methods perform recursive mechanisms to create, update, merge, and delete information granules according to the data behavior. For the first time in the evolving-intelligent-systems literature, the proposed method, eLP, is able to process streams of words and sentences. In terms of AD accuracy, FBeM achieved (85.64 ± 3.69)%, eGNN reached (96.17 ± 0.78)%, eGFC obtained (92.48 ± 1.21)%, and eLP reached (96.05 ± 1.04)%. Besides being competitive, eLP additionally generates a log grammar and presents a higher level of model interpretability.
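For readers unfamiliar with the evolving-granular paradigm this abstract refers to, the following is a minimal, illustrative Python sketch of an incremental classifier whose granules are created and expanded as a data stream arrives. It is a hedged toy under assumed semantics, not the FBeM, eGNN, eGFC, or eLP algorithms from the thesis, whose granule shapes and merging/deletion rules are considerably richer.

```python
# Minimal sketch of an evolving granular classifier: granules are
# axis-aligned boxes created and expanded as a stream arrives.
# Illustrative only -- not the thesis's FBeM/eGNN/eGFC/eLP methods.
import numpy as np

class Granule:
    def __init__(self, x, label):
        self.lo = x.copy()           # lower bound per feature
        self.hi = x.copy()           # upper bound per feature
        self.label = label

    def contains(self, x, slack):
        return np.all(x >= self.lo - slack) and np.all(x <= self.hi + slack)

    def distance(self, x):
        return np.linalg.norm(x - (self.lo + self.hi) / 2.0)

class EvolvingGranularClassifier:
    def __init__(self, slack=0.1):
        self.slack = slack
        self.granules = []

    def predict(self, x):
        if not self.granules:
            return None
        return min(self.granules, key=lambda g: g.distance(x)).label

    def learn(self, x, label):
        # Update: expand the first compatible granule of the same class...
        for g in self.granules:
            if g.label == label and g.contains(x, self.slack):
                g.lo = np.minimum(g.lo, x)
                g.hi = np.maximum(g.hi, x)
                return
        # ...or create a new granule if none fits (recursive structure growth).
        self.granules.append(Granule(x, label))

# Predict-then-learn on each arriving pair, as in prequential
# (test-then-train) evaluation of data-stream methods.
clf = EvolvingGranularClassifier(slack=0.2)
stream = [([0.1, 0.2], "normal"), ([0.15, 0.25], "normal"), ([0.9, 0.8], "anomaly")]
for x, y in stream:
    x = np.asarray(x, dtype=float)
    guess = clf.predict(x)
    clf.learn(x, y)
```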
Abstract:
As distributed collaborative applications and architectures adopt policy-based management for tasks such as access control, network security, and data privacy, the management and consolidation of large numbers of policies is becoming a crucial component of such policy-based systems. Large-scale distributed collaborative applications such as web services need to analyze policy interactions and to integrate policies. In this thesis we propose and implement EXAM-S, a comprehensive environment for policy analysis and management, which can be used to perform a variety of functions such as policy property analysis, policy similarity analysis, and policy integration. As part of this environment, we have proposed and implemented new techniques for policy analysis that build on a thorough study of state-of-the-art techniques. Moreover, we propose an approach for solving the heterogeneity problems that usually arise when analyzing policies belonging to different domains. Our work focuses on the analysis of access control policies written in XACML (eXtensible Access Control Markup Language). We consider XACML because it is a rich language that can represent many policies of interest to real-world applications and is gaining widespread adoption in industry.
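As a purely illustrative aside (not part of EXAM-S, and a drastic simplification of XACML's attribute and rule-combining semantics), one of the tasks mentioned above, policy similarity analysis, can be pictured as set overlap between rules. The hypothetical sketch below scores two toy policies with a Jaccard index over their permit rules.

```python
# Toy policy-similarity scoring: reduce each policy to a set of
# (subject, action, resource) permit rules and compare the sets.
# A hypothetical simplification, not the EXAM-S algorithms.
def rule_set(policy):
    return {(r["subject"], r["action"], r["resource"])
            for r in policy["rules"] if r["effect"] == "Permit"}

def jaccard_similarity(p1, p2):
    a, b = rule_set(p1), rule_set(p2)
    return len(a & b) / len(a | b) if (a | b) else 1.0

p1 = {"rules": [
    {"subject": "nurse", "action": "read", "resource": "record", "effect": "Permit"},
    {"subject": "doctor", "action": "write", "resource": "record", "effect": "Permit"},
]}
p2 = {"rules": [
    {"subject": "doctor", "action": "write", "resource": "record", "effect": "Permit"},
]}
print(jaccard_similarity(p1, p2))  # 0.5: one shared permit rule out of two
```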
Abstract:
Our research takes place in the context of a discipline known as Communication for Development, situated within the field of Communication for Social Change and characterized by the use of interpersonal and mass communication theories and tools applied to international development cooperation. Our study aims to point out a change of paradigm in this field: our object is Public Administration's communication, and therefore what we suggest is a shift from Communication for Development to Development Communication. The object of our study thus becomes the discourse itself, in its double action of representing and constructing reality. In particular, we are interested in the discourse's contribution to the creation of a collective imagination, which is the perspective towards which we have oriented the analysis, through a structuralist semiotics-based methodology integrated with a socio-semiotic approach. Considering that in our contemporary society (that is to say, a 'Western' and 'First World' society) the internet is a crucial public space for the mediation and management of collective imagination, we chose the websites of Public Bodies dedicated to International Cooperation as our analysis corpus, owing to their symbolic and ideological significance as well as to the actual political responsibility we think these websites should carry. The results of our analysis allow us to identify some discursive strategies used in the websites of Public Bodies. In these sites there is a tendency to shift the discourse around international cooperation away from the ideological axis (thereby avoiding any explicit political statement about the causes of the injustices and imbalances that make development support necessary, i.e. avoiding mentioning values such as social justice and democracy while acknowledging the socio-economic institutions that contribute to fostering underdevelopment on a global scale) towards the ethical axis, hence referring to moral values concerning the private sphere (human solidarity and charity), which is delegated mainly to non-governmental associations.
Abstract:
The present work is devoted to the assessment of the physics of energy fluxes in the space of scales and in physical space for wall-turbulent flows. The generalized Kolmogorov equation will be applied to DNS data of a turbulent channel flow in order to describe the paths of the energy fluxes from production to dissipation in the augmented space of wall-turbulent flows. This multidimensional description will be shown to be crucial for understanding the formation and sustainment of the turbulent fluctuations fed by the energy fluxes coming from the near-wall production region. An unexpected behavior of the energy fluxes emerges from this analysis, consisting of spiral-like paths in the combined physical/scale space, in which the controversial reverse energy cascade plays a central role. The observed behavior conflicts with the classical notion of the Richardson/Kolmogorov energy cascade and may have strong repercussions on both theoretical and modeling approaches to wall turbulence. To this aim, a new relation stating the leading physical processes governing the energy transfer in wall turbulence is suggested and shown to be able to capture most of the rich dynamics of the shear-dominated region of the flow. Two dynamical processes are identified as driving mechanisms for the fluxes: one in the near-wall region and a second one further away from the wall. The former, stronger one is related to the dynamics involved in the near-wall turbulence regeneration cycle. The second suggests an outer self-sustaining mechanism, asymptotically expected to take place in the log layer, which could explain the debated mixed inner/outer scaling of the near-wall statistics. The same approach is applied for the first time to a filtered velocity field. A generalized Kolmogorov equation specialized to filtered velocity fields is derived and discussed. The results will show what effects the subgrid scales have on the resolved motion in both physical and scale space, singling out the prominent role of the filter length compared with the cross-over scale between the production-dominated scales and the inertial range, lc, and with the reverse-energy-cascade region, lb. The systematic characterization of the resolved and subgrid physics as a function of the filter scale and of the wall distance will be shown to be instrumental for a correct use of LES models in the simulation of wall-turbulent flows. Taking inspiration from the new relation for the energy transfer in wall turbulence, a new class of LES models will also be proposed. Finally, the generalized Kolmogorov equation specialized to filtered velocity fields will be shown to be a helpful statistical tool for the assessment of LES models and for the development of new ones. As an example, some classical purely dissipative eddy-viscosity models are analyzed via an a priori procedure.
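For orientation, the generalized Kolmogorov equation referred to here is a scale-energy budget for the second-order structure function; in the schematic form commonly written for channel flow it reads as below. The notation is an assumption for illustration, not necessarily the thesis's own.

```latex
% Schematic scale-energy budget for channel flow: delta denotes
% increments over the separation vector r, starred quantities are
% mid-point values, and Y_c is the wall-normal mid-point position.
\nabla_{\mathbf r}\cdot\langle |\delta\mathbf u|^{2}\,\delta\mathbf u\rangle
+ \frac{\partial \langle |\delta\mathbf u|^{2}\, v^{*}\rangle}{\partial Y_c}
= -2\,\langle \delta u\,\delta v\rangle \left(\frac{dU}{dy}\right)^{*}
- \frac{2}{\rho}\,\frac{\partial \langle \delta p\,\delta v\rangle}{\partial Y_c}
+ 2\nu\,\nabla_{\mathbf r}^{2}\langle |\delta\mathbf u|^{2}\rangle
+ \frac{\nu}{2}\,\frac{\partial^{2}\langle |\delta\mathbf u|^{2}\rangle}{\partial Y_c^{2}}
- 4\langle \epsilon^{*}\rangle
```

The two flux terms on the left-hand side, one acting in the space of scales r and one in physical space Y_c, are the quantities whose spiral-like paths the abstract describes; the right-hand side collects production, pressure transport, viscous diffusion, and dissipation.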
Abstract:
The main objective of the research is to reconstruct the state of the art in the fields of eHealth and the Electronic Health Record (Fascicolo Sanitario Elettronico), with particular attention to the protection of personal data and to interoperability. To this end, binding and non-binding European Union documents were examined, as well as selected European and national projects (such as "Smart Open Services for European Patients" (EU); "Elektronische Gesundheitsakte" (Austria); "MedCom" (Denmark); and, for Italy, "Infrastruttura tecnologica del Fascicolo Sanitario Elettronico", "OpenInFSE: Realizzazione di un'infrastruttura operativa a supporto dell'interoperabilità delle soluzioni territoriali di fascicolo sanitario elettronico nel contesto del sistema pubblico di connettività", "Evoluzione e interoperabilità tecnologica del Fascicolo Sanitario Elettronico", and "IPSE - Sperimentazione di un sistema per l'interoperabilità europea e nazionale delle soluzioni di Fascicolo Sanitario Elettronico: componenti Patient Summary e ePrescription"). The legal and technical analyses show an urgent need to define models that encourage the use of health data and implement effective strategies for the secondary use of digital health data, such as Open Data and Linked Open Data. Legal and technological harmonization is seen as strategic for reducing both the conflicts over personal data protection existing among the Member States and the lack of interoperability between the European information systems for Electronic Health Records. To this end, three guidelines have been identified: (1) regulatory harmonization, (2) harmonization of rules, (3) harmonization of information-system design. The principles of Privacy by Design ("proactive" and "win-win"), as well as Semantic Web standards, are considered key to achieving this change.
Abstract:
Changepoint analysis is a well-established area of statistical research, but in the context of spatio-temporal point processes it is as yet relatively unexplored. Some substantial differences with respect to standard changepoint analysis have to be taken into account: firstly, at every time point the datum is an irregular pattern of points; secondly, in real situations issues of spatial dependence between points and of temporal dependence within time segments arise. Our motivating example concerns the monitoring and recovery of radioactive particles from Sandside beach, in the north of Scotland; there have been two major changes in the equipment used to detect the particles, representing known potential changepoints in the number of retrieved particles. In addition, offshore particle-retrieval campaigns are believed to reduce the particle intensity onshore with an unknown temporal lag; in this latter case the problem concerns multiple unknown changepoints. We therefore propose a Bayesian approach for detecting multiple changepoints in the intensity function of a spatio-temporal point process, allowing for spatial and temporal dependence within segments. We use log-Gaussian Cox processes, a very flexible class of models suitable for environmental applications, which can be implemented using the integrated nested Laplace approximation (INLA), a computationally efficient alternative to Markov chain Monte Carlo methods for approximating the posterior distribution of the parameters. Once the posterior curve is obtained, we propose a few methods for detecting significant changepoints. We present a simulation study, which consists of generating spatio-temporal point-pattern series under several scenarios; the performance of the methods is assessed in terms of type I and type II errors, detected changepoint locations, and accuracy of the segment intensity estimates. We finally apply the above methods to the motivating dataset and find good and sensible results about the presence and quality of changes in the process.
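As a hedged illustration of the underlying inferential idea (temporal counts only; the thesis uses spatio-temporal log-Gaussian Cox processes fitted with INLA, not this conjugate toy model), the sketch below places a discrete uniform prior on a single changepoint in Poisson counts and computes its posterior in closed form.

```python
# Toy Bayesian changepoint detection: Poisson counts per time step
# with one unknown changepoint in the intensity. Gamma(a, b) priors
# on each segment's rate integrate out analytically, giving a closed
# form for the marginal likelihood of each candidate changepoint tau.
import numpy as np
from scipy.special import gammaln

rng = np.random.default_rng(0)
T, tau_true = 100, 60
counts = np.concatenate([rng.poisson(3.0, tau_true),       # rate 3 before
                         rng.poisson(8.0, T - tau_true)])  # rate 8 after

a, b = 1.0, 1.0  # Gamma prior hyperparameters for each segment's rate
def log_marginal(segment):
    # log of the Gamma-Poisson marginal likelihood; the constant
    # sum(log y!) term is dropped since it cancels across tau.
    n, s = len(segment), segment.sum()
    return gammaln(a + s) - gammaln(a) + a * np.log(b) - (a + s) * np.log(b + n)

# Discrete uniform prior over changepoint locations tau = 1..T-1.
log_post = np.array([log_marginal(counts[:t]) + log_marginal(counts[t:])
                     for t in range(1, T)])
post = np.exp(log_post - log_post.max())
post /= post.sum()
print("MAP changepoint:", 1 + np.argmax(post))  # close to tau_true = 60
```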