15 results for "Aftermath of cerebrovascular event"
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
The surface electrocardiogram (ECG) is an established diagnostic tool for the detection of abnormalities in the electrical activity of the heart. The usefulness of the ECG, however, extends beyond diagnosis. In recent years, studies in cognitive psychophysiology have related heart rate variability (HRV) to memory performance and mental workload. The aim of this thesis was to analyze the variability of rhythms derived from the surface ECG at two different time scales: the discrete-event time scale typical of beat-related features (Objective I), and the “continuous” time scale of separated sources in the ECG (Objective II), in selected scenarios relevant to psychophysiological and clinical research, respectively. Objective I) Joint time-frequency and non-linear analysis of HRV was carried out, with the goal of assessing psychophysiological workload (PPW) in response to tasks engaging working memory. Results from fourteen healthy young subjects suggest the potential of the proposed indices for discriminating PPW levels in response to varying memory-search task difficulty. Objective II) A novel source-cancellation method based on morphology clustering was proposed for estimating the atrial wavefront in atrial fibrillation (AF) from body surface potential maps. A strong direct correlation between the spectral concentration (SC) of the atrial wavefront and the temporal variability of the spectral distribution was shown in persistent AF patients, suggesting that with higher SC a shorter observation time is required to collect the spectral distribution from which the fibrillatory rate is estimated. This could be time- and cost-effective in clinical decision-making. The results held for reduced lead sets, suggesting that a simplified setup could also be considered, further reducing costs. In designing the methods of this thesis, an online signal-processing approach was maintained, with the goal of contributing to real-world applicability.
An algorithm for automatic assessment of ambulatory ECG quality and an automatic ECG delineation algorithm were also designed and validated.
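The spectral concentration (SC) mentioned above is commonly defined as the fraction of spectral power lying in a narrow band around the dominant fibrillatory frequency. Under that assumption (the exact band and frequency search range are illustrative choices, not the thesis' stated parameters), a minimal sketch of the computation is:

```python
import numpy as np

def spectral_concentration(x, fs, band=1.0, fmin=3.0, fmax=12.0):
    """Estimate the dominant frequency (DF) of a signal and its spectral
    concentration (SC): the fraction of in-band spectral power lying within
    +/- `band` Hz of the DF, searched inside [fmin, fmax] Hz.

    NOTE: this is one common definition used in AF analysis; the band
    limits and search range here are assumptions, not the thesis' values.
    """
    n = len(x)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(x - np.mean(x))) ** 2   # periodogram (unnormalized)

    in_band = (freqs >= fmin) & (freqs <= fmax)
    df = freqs[in_band][np.argmax(psd[in_band])]     # dominant frequency

    around_df = (freqs >= df - band) & (freqs <= df + band)
    sc = psd[around_df].sum() / psd[in_band].sum()
    return df, sc
```

A signal dominated by a single fibrillatory frequency yields SC close to 1; a broadband signal yields a markedly lower value.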
Abstract:
The main goal of this thesis is to facilitate the development of industrial automated systems by applying formal methods to ensure system reliability. A new formulation of the distributed diagnosability problem in terms of Discrete Event Systems theory and the automata framework is presented, which is then used to enforce the desired property of the system rather than merely verifying it. The approach tackles the state-explosion problem with modeling patterns and new algorithms aimed at verifying the diagnosability property in the context of the distributed diagnosability problem. The concepts are validated with a newly developed software tool.
Abstract:
This thesis reports the analysis performed to reconstruct the transverse momentum p_{t} spectra of pions, kaons and protons identified with the TOF detector of the ALICE experiment in pp Minimum Bias collisions at $\sqrt{s}=7$ TeV.
After a detailed description of all the parameters that influence the TOF PID performance (time resolution, calibration, alignment, matching efficiency, time-zero of the event), the method used to identify the particles, an unfolding procedure, is discussed. With this method, thanks also to the excellent TOF performance, the pion and kaon spectra can be reconstructed in the 0.5
Abstract:
Flood disasters are a major cause of fatalities and economic losses, and several studies indicate that global flood risk is currently increasing. In order to reduce and mitigate the impact of river flood disasters, the current trend is to integrate existing structural defences with non-structural measures. This calls for a wider application of advanced hydraulic models for flood hazard and risk mapping, engineering design, and flood forecasting systems. Within this framework, two different hydraulic models for large-scale analysis of flood events have been developed. The two models, named CA2D and IFD-GGA, adopt an integrated approach based on the diffusive shallow water equations and a simplified finite volume scheme. The models are also designed for massive code parallelization, which is of key importance in reducing run times in large-scale and high-detail applications. The two models were first applied to several numerical cases to test the reliability and accuracy of the different model versions. The most effective versions were then applied to different real flood events and flood scenarios. The IFD-GGA model showed serious problems that prevented further applications. On the contrary, the CA2D model proved to be fast and robust, and able to reproduce 1D and 2D flow processes in terms of water depth and velocity. In most applications the accuracy of the model results was good and adequate for large-scale analysis. Where complex flow processes occurred, local errors were observed due to the model approximations; however, they did not compromise the correct representation of the overall flow processes. In conclusion, the CA2D model can be a valuable tool for the simulation of a wide range of flood event types, including lowland and flash flood events.
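As an illustration of the kind of scheme described above, the following is a minimal one-dimensional explicit finite-volume step for the diffusive (zero-inertia) shallow water equations, with a Manning friction flux and a simple flux limiter against the overshoot typical of explicit diffusive-wave solvers. Every detail (the limiter, the upwinding, the Manning coefficient) is a generic textbook choice, not the actual CA2D or IFD-GGA formulation:

```python
import numpy as np

def diffusive_wave_step(h, z, dx, dt, n_manning=0.03):
    """One explicit finite-volume step of the 1-D diffusive shallow water
    equations: Manning flux q = h^(5/3)/n * sqrt(|dH/dx|) directed down the
    free-surface gradient H = h + z, with zero-flux (closed) boundaries.
    Illustrative sketch only, not the CA2D/IFD-GGA code.
    """
    H = h + z
    slope = (H[1:] - H[:-1]) / dx                 # free-surface slope at interfaces
    # upwind depth: depth on the higher-head side of each interface
    h_up = np.where(slope > 0, h[1:], h[:-1])
    mag = (h_up ** (5.0 / 3.0) / n_manning) * np.sqrt(np.abs(slope))
    # flux limiter: never move more than 1/4 of the head difference per step
    mag = np.minimum(mag, np.abs(slope) * dx * dx / (4.0 * dt))
    q = -np.sign(slope) * mag                     # flow goes from high H to low H
    flux = np.concatenate(([0.0], q, [0.0]))      # zero-flux boundary conditions
    return np.maximum(h - dt / dx * (flux[1:] - flux[:-1]), 0.0)
```

Starting from a mound of water on a flat bed, repeated steps conserve total volume while the mound spreads and its peak lowers.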
Abstract:
The arousal scoring in Obstructive Sleep Apnea Syndrome (OSAS) is important to clarify the impact of the disease on sleep, but the currently applied American Academy of Sleep Medicine (AASM) definition may underestimate subtle alterations of sleep. The aims of the present study were to evaluate the impact of respiratory events on the cortical and autonomic arousal response, and to quantify the additional value of the cyclic alternating pattern (CAP) and pulse wave amplitude (PWA) for a more accurate detection of respiratory events and sleep alterations in OSAS patients. A retrospective revision of 19 polysomnographic recordings of OSAS patients was carried out. The analysis focused on the quantification of apneas (AP), hypopneas (H) and flow limitation (FL) events, and on the investigation of cerebral and autonomic activity. Only 41.1% of the FL events analyzed in non-rapid eye movement (NREM) sleep met the AASM rules for the definition of respiratory event-related arousal (RERA), while 75.5% of FL events ended with a CAP A phase. The dual response (EEG-PWA) was the most frequent response for all subtypes of respiratory event, with a progressive reduction from AP to H and FL. 87.7% of respiratory events with EEG activation also showed a PWA drop, and 53.4% of the respiratory events without EEG activation presented a PWA drop. The relationship between respiratory events and the arousal response is more complex than that suggested by the international classification. In the estimation of the response to respiratory events, CAP scoring and PWA analysis can offer more extensive information than the AASM rules. Our data also confirm that the application of PWA scoring improves the detection of respiratory events and could reduce the underestimation of OSAS severity compared to AASM arousal scoring.
Abstract:
Atrial fibrillation is associated with a five-fold increase in the risk of cerebrovascular events, being responsible for 15-18% of all strokes. The morphological and functional remodelling of the left atrium caused by atrial fibrillation favours blood stasis and, consequently, stroke risk. In this context, several clinical studies suggest that stroke risk stratification could be improved by using haemodynamic information on the left atrium (LA) and the left atrial appendage (LAA). The goal of this study was to develop a personalized computational fluid-dynamics (CFD) model of the left atrium which could clarify the haemodynamic implications of atrial fibrillation on a patient-specific basis. The developed CFD model was first applied to better understand the role of the LAA in stroke risk. In fact, the interplay of LAA geometric parameters such as LAA length, tortuosity, surface area and volume with the fluid-dynamics parameters, and the effects of LAA closure, had not been investigated. Results demonstrated the capability of the CFD model to reproduce the real physiological behaviour of blood flow dynamics inside the LA and the LAA. Finally, we determined that the fluid-dynamics parameters developed in this research project could be used as new quantitative indexes to describe the different types of AF and open new scenarios for patient-specific stroke risk stratification.
Abstract:
This thesis takes two perspectives on political institutions. On the one side, it examines the long-run effects of institutions on cultural values. On the other side, I study the strategic communication of politicians, a pivotal actor inside those institutions, and its determinants. The first chapter provides evidence for the legacy of feudalism - a set of labor-coercion and migration restrictions - on interpersonal distrust. I combine administrative data on the feudal system in the Prussian Empire (1816-1849) with geo-localized survey data from the German Socio-Economic Panel (1980-2020). Using OLS and mover specifications, I show that areas with strong historical exposure to feudalism have lower levels of interpersonal trust today. The second chapter builds a novel dataset that includes the Twitter handles of 18,000+ politicians and 61+ million tweets from 2008-2021 from all levels of government. I find substantial partisan differences in Twitter adoption, Twitter activity and audience engagement. I use established tools for measuring ideological polarization to provide evidence that online polarization follows trends similar to offline polarization, at comparable magnitude, and reaches unprecedented heights in 2018 and 2021. I develop a new tool to demonstrate a marked increase in affective polarization. The third chapter tests whether politicians disseminate distortive messages when exposed to bad news. Specifically, I study the diffusion of misleading communication from pro-gun politicians in the aftermath of mass shootings. I exploit the random timing of mass shootings and analyze half a million tweets between 2010 and 2020 in an event-study design. I develop and apply state-of-the-art text-analysis tools to show that pro-gun politicians seek to decrease the salience of mass shootings through distraction and try to alter voters' belief formation by misrepresenting the causes of the shootings.
Abstract:
Objective: To investigate the association between the four traditional coronary heart disease (CHD) risk factors (hypertension, smoking, hypercholesterolemia, and diabetes) and outcomes of first ACS. Methods: Data were drawn from the ISACS Archives. The study participants consisted of 70,953 patients with first ACS but without prior CHD. Primary outcomes were patients' age at hospital presentation and 30-day all-cause mortality. The risk ratios for mortality among subgroups were calculated using a balancing strategy by inverse probability weighting. Trends were evaluated by Pearson's correlation coefficient (r). Results: For fatal ACS (n=6097), exposure to at least one traditional CHD-risk factor ranged from 77.6% in women to 74.5% in men. The presence of all four CHD-risk factors significantly decreased the age at the time of the ACS event and death by nearly half a decade compared with the absence of any traditional risk factors, in both women (from 67.1±12.0 to 61.9±10.3 years; r=-0.089, P<0.001) and men (from 62.8±12.2 to 58.9±9.9 years; r=-0.096, P<0.001). By contrast, there was an inverse association between the number of traditional CHD-risk factors and 30-day mortality. The mortality rates in women ranged from 7.7% with four traditional CHD-risk factors to 16.3% with no traditional risk factors (r=0.073, P<0.001). The corresponding rates in men were 4.8% and 11.5% (r=0.078, P<0.001), respectively. The risk ratios among individuals with at least one CHD-risk factor vs. those with no traditional risk factors were 0.72 (95%CI:0.65-0.79) in women and 0.64 (95%CI:0.59-0.70) in men. This association was consistent among patient subgroups managed with guideline-recommended therapeutic options. Conclusions: The vast majority of patients who die of ACS have traditional CHD-risk factor exposure.
Patients with CHD-risk factors die much earlier in life, but they have a lower relative risk of 30-day mortality than those with no traditional CHD-risk factors, even in the context of equitable evidence‐based treatments after hospital admission.
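The balancing strategy by inverse probability weighting referred to above can be sketched as follows. The propensity model (a plain logistic regression fitted by gradient ascent) and the unstabilized weights are illustrative simplifications, not the ISACS analysis pipeline:

```python
import numpy as np

def ipw_risk_ratio(exposed, outcome, covariates):
    """Risk ratio for a binary `outcome` comparing exposed vs unexposed,
    balancing `covariates` by inverse probability weighting (IPW):
    each subject is weighted by the inverse of the probability of the
    exposure group they actually belong to, as predicted by a propensity
    model. Illustrative sketch with a hand-rolled logistic regression.
    """
    X = np.column_stack([np.ones(len(exposed)), covariates])
    beta = np.zeros(X.shape[1])
    for _ in range(2000):   # simple gradient ascent on the log-likelihood
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        beta += 0.1 * X.T @ (exposed - p) / len(exposed)
    p = 1.0 / (1.0 + np.exp(-X @ beta))           # propensity scores
    w = np.where(exposed == 1, 1.0 / p, 1.0 / (1.0 - p))
    risk1 = np.sum(w * outcome * (exposed == 1)) / np.sum(w * (exposed == 1))
    risk0 = np.sum(w * outcome * (exposed == 0)) / np.sum(w * (exposed == 0))
    return risk1 / risk0
```

On synthetic data where the exposed group has half the event risk of the unexposed group, the weighted risk ratio lands near 0.5.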
Abstract:
Motion control is a sub-field of automation in which the position and/or velocity of machines are controlled using some type of device. In motion control, the position, velocity, force, pressure, etc., profiles are designed in such a way that the different mechanical parts work as a harmonious whole, in which perfect synchronization must be achieved. The real-time exchange of information in the distributed system that an industrial plant is nowadays plays an important role in achieving ever better performance, effectiveness and safety. The network connecting field devices such as sensors and actuators, field controllers such as PLCs, regulators and drive controllers, and man-machine interfaces is commonly called a fieldbus. Since motion transmission is now a task of the communication system, and no longer of kinematic chains as in the past, the communication protocol must ensure that the desired profiles, and their properties, are correctly transmitted to the axes and then reproduced; otherwise the synchronization among the different parts is lost, with all the resulting consequences. This thesis addresses the problem of trajectory reconstruction in the case of an event-triggered communication system. The most important feature that a real-time communication system must have is the preservation of the following temporal and spatial properties: absolute temporal consistency, relative temporal consistency, and spatial consistency. Starting from the basic system composed of one master and one slave, and moving on to systems made up of many slaves and one master, or many masters and one slave, the problems in profile reconstruction and in the preservation of temporal properties, and subsequently in the synchronization of different profiles, are shown for networks adopting an event-triggered communication system. These networks are characterized by the lack of a common knowledge of global time
and are therefore non-deterministic. Each topology is analyzed, and the solution based on phase-locked loops proposed for the basic master-slave case is extended to handle the other configurations.
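The phase-locked-loop idea for the basic master-slave case can be sketched as a discrete PI loop: the slave free-runs its local estimate of the master time base between events and corrects phase and rate at each received event. The gains and the loop structure below are illustrative assumptions, not the thesis' design:

```python
def pll_track(observations, dt, kp=0.7, ki=0.3):
    """Minimal discrete phase-locked loop. At each received event the slave
    measures the phase error between the observed master value and its own
    free-running estimate, applies a fast proportional phase correction and
    a slow integral rate (frequency) correction. Illustrative only.
    """
    estimate, rate, integral = 0.0, 1.0, 0.0
    out = []
    for obs in observations:
        estimate += rate * dt          # free-run prediction between events
        err = obs - estimate           # phase error at the received event
        integral += ki * err * dt      # accumulate rate mismatch evidence
        rate = 1.0 + integral          # slow frequency correction
        estimate += kp * err           # fast phase correction
        out.append(estimate)
    return out
```

Fed with events from a master clock drifting 2% fast, the loop locks: the residual tracking error shrinks far below one event period.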
Abstract:
Since the first underground nuclear explosion, carried out in 1958, the analysis of seismic signals generated by these sources has allowed seismologists to refine the travel times of seismic waves through the Earth and to verify the accuracy of location algorithms (the ground truth for these sources was often known). Long international negotiations have been devoted to limiting the proliferation and testing of nuclear weapons. In particular, the Comprehensive Nuclear-Test-Ban Treaty (CTBT) was opened for signature in 1996; although it has been signed by 178 States, it has not yet entered into force. The Treaty underlines the fundamental role of seismological observations in verifying its compliance, by detecting and locating seismic events and identifying the nature of their sources. A precise definition of the hypocentral parameters is the first step in discriminating whether a given seismic event is natural or not. In case a specific event is considered suspicious by the majority of the State Parties, the Treaty contains provisions for conducting an on-site inspection (OSI) in the area surrounding the epicenter of the event, located through the International Monitoring System (IMS) of the CTBT Organization. An OSI is supposed to include the use of passive seismic techniques in the area of the suspected clandestine underground nuclear test. In fact, high-quality seismological systems are thought to be capable of detecting and locating the very weak aftershocks triggered by underground nuclear explosions in the first days or weeks following the test. This PhD thesis deals with the development of two different seismic location techniques. The first, known as the double-difference joint hypocenter determination (DDJHD) technique, is aimed at locating closely spaced events at the global scale. The locations obtained by this method are characterized by high relative accuracy, although the absolute location of the whole cluster remains uncertain.
We eliminate this problem by introducing a priori information: the known location of a selected event. The second technique concerns reliable estimates of the back azimuth and apparent velocity of seismic waves from local events of very low magnitude recorded by a tripartite array at a very local scale. For both techniques, we used cross-correlation among digital waveforms in order to minimize the errors linked with incorrect phase picking. The cross-correlation method relies on the similarity between the waveforms of a pair of events at the same station, at the global scale, and on the similarity between the waveforms of the same event at two different sensors of the tripartite array, at the local scale. After preliminary tests on the reliability of our location techniques based on simulations, we applied both methodologies to real seismic events. The DDJHD technique was applied to a seismic sequence that occurred in the Turkey-Iran border region, using the data recorded by the IMS. At first, the algorithm was applied to the differences among the original arrival times of the P phases, so cross-correlation was not used. We found that the considerable geometrical spreading noticeable in the standard locations (namely the locations produced by the analysts of the International Data Center (IDC) of the CTBT Organization, assumed as our reference) was considerably reduced by the application of our technique. This is what we expected, since the methodology was applied to a sequence of events for which we can suppose a real closeness among the hypocenters, belonging to the same seismic structure. Our results point out the main advantage of this methodology: the systematic errors affecting the arrival times are removed or at least reduced.
The introduction of cross-correlation did not bring evident improvements to our results: the two sets of locations (without and with the application of the cross-correlation technique) are very similar to each other. This suggests that cross-correlation did not substantially improve the precision of the manual pickings: the pickings reported by the IDC are probably good enough to make the random picking error less important than the systematic error on travel times. A further explanation for the limited benefit of cross-correlation is that the events in our data set generally do not have a good signal-to-noise ratio (SNR): the selected sequence is composed of weak events (magnitude 4 or smaller), and the signals are strongly attenuated because of the large distance between the stations and the hypocentral area. At the local scale, in addition to cross-correlation, we performed a signal interpolation in order to improve the time resolution. The resulting algorithm was applied to the data collected during an experiment carried out in Israel between 1998 and 1999. The results point out the following relevant conclusions: a) it is necessary to correlate waveform segments corresponding to the same seismic phases; b) it is not essential to select the exact first arrivals; and c) relevant information can also be obtained from the maximum-amplitude wavelet of the waveforms (particularly in bad SNR conditions). Another remarkable point of our procedure is that its application does not demand a long processing time, so the user can immediately check the results. During a field survey, this feature makes possible a quasi-real-time check, allowing the immediate optimization of the array geometry if so suggested by the results at an early stage.
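The core cross-correlation step used at both scales can be sketched as follows. The parabolic interpolation of the correlation peak stands in for the signal interpolation mentioned above as a generic way to get sub-sample resolution; it is not necessarily the technique adopted in the thesis:

```python
import numpy as np

def cc_delay(x, y, fs):
    """Relative delay of waveform y with respect to x via cross-correlation,
    with parabolic interpolation of the correlation peak for sub-sample
    resolution. Returns the delay in seconds (positive: y lags x).
    """
    x = x - np.mean(x)
    y = y - np.mean(y)
    cc = np.correlate(y, x, mode="full")
    k = int(np.argmax(cc))
    # parabolic fit around the discrete peak for sub-sample precision
    if 0 < k < len(cc) - 1:
        denom = cc[k - 1] - 2 * cc[k] + cc[k + 1]
        shift = 0.5 * (cc[k - 1] - cc[k + 1]) / denom if denom != 0 else 0.0
    else:
        shift = 0.0
    lag = (k - (len(x) - 1)) + shift
    return lag / fs
```

Applied to the same pulse recorded at two sensors with a known offset, the estimated delay matches the true one to a small fraction of a sample.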
Abstract:
Gynandromorphism is the phenomenon by which an organism simultaneously manifests both male and female phenotypic characteristics. For the class Insecta, numerous reports of this phenomenon can be found in the literature, but a general interpretation of the origins and causes that generate it has not yet been provided. The purpose of this thesis was to study the phenomenon in the Diprionid sawfly Diprion pini (Linnaeus, 1758) through controlled rearing of the insect, inbreeding experiments, study of the karyotype, and evaluation of the appearance and distribution of male and female tissues in gynandromorphic specimens. Other biological parameters (such as the weights of individuals) were taken into account in an attempt to explain the genetic mechanisms that regulate sex determination in this species.
Abstract:
Background: Nilotinib is a potent and selective BCR-ABL inhibitor. The phase 3 ENESTnd trial demonstrated the superior efficacy of nilotinib vs imatinib, with higher and faster molecular responses. After 24 months, the rates of progression to accelerated-blastic phase (ABP) were 0.7% and 1.1% with nilotinib 300mg and 400mg BID, respectively, significantly lower compared to imatinib (4.2%). Nilotinib has been approved for the frontline treatment of Ph+ CML. With imatinib 400mg (IRIS trial), the rates of any event and of progression to ABP were higher during the first 3 years. Consequently, a confirmation of the durability of responses to nilotinib beyond 3 years is extremely important. Aims: To evaluate the response and the outcome of patients treated for 3 years with nilotinib 400mg BID as frontline therapy. Methods: A multicentre phase 2 trial was conducted by the GIMEMA CML WP (ClinicalTrials.gov NCT00481052). Minimum 36-month follow-up data for all patients will be presented. Definitions: Major Molecular Response (MMR): BCR-ABL/ABL ratio <0.1% IS; Complete Molecular Response (CMR): undetectable transcript levels with ≥10,000 ABL transcripts; failures: according to the revised ELN recommendations; events: failures and treatment discontinuation for any reason. All analyses were made according to the intention-to-treat principle. Results: 73 patients were enrolled: median age 51 years; 45% low, 41% intermediate and 14% high Sokal risk. The cumulative incidence of CCgR at 12 months was 100%. CCgR at each milestone: 78%, 96%, 96%, 95%, 92% at 3, 6, 12, 18 and 24 months, respectively. The overall estimated probability of MMR was 97%, while the rates of MMR at 3, 6, 12, 18 and 24 months were 52%, 66%, 85%, 81% and 82%, respectively. The overall estimated probability of CMR was 79%, while the rates of CMR at 12 and 24 months were 12% and 27%, respectively. No patient achieving a MMR progressed to AP.
Only one patient progressed to ABP, at 6 months, and subsequently died (high Sokal risk, T315I mutation). Adverse events were mostly grade 1 or 2 and manageable with appropriate dose adaptations. During the first 12 months, the mean daily dose was 600-800mg in 74% of patients. The last daily dose of nilotinib was as follows: 800mg in 46 (63%) patients, 600mg in 3 (4%) patients and 400mg in 18 (25%), with 6 permanent discontinuations. Details of discontinuation: 1 patient progressed to ABP; 3 patients had recurrent episodes of amylase and/or lipase increase (no pancreatitis); 1 patient had atrial fibrillation (unrelated to the study drug); and 1 patient died after 32 months of mental deterioration and starvation (unrelated to the study drug). Two patients are currently on imatinib second-line and 2 on dasatinib third-line. With a median follow-up of 39 months, the estimated probabilities of overall survival, progression-free survival and failure-free survival were 97%, and the estimated probability of event-free survival was 91%. Conclusions: The rate of failures was very low during the first 3 years. Responses remain stable. The high rates of responses achieved during the first 12 months are being translated into optimal outcomes for most patients.
Abstract:
This doctoral thesis falls within the framework of the agreement between ARPA_SIMC (the funding body), the Regional Civil Protection Agency, and the Department of Earth and Geological-Environmental Sciences of the University of Bologna. Its main objective is the determination of possible rainfall thresholds for the triggering of landslides in Emilia-Romagna, to be used as a forecasting support tool in the Civil Protection operations room. In such a complex geological context, a traditional empirical approach is not sufficient to discriminate unambiguously between triggering and non-triggering meteorological events, and in general the distribution of the data appears too scattered to draw a statistically significant threshold. It was therefore decided to apply a rigorous Bayesian statistical approach, innovative in that it computes the probability of landslide occurrence given a certain rainfall event, P(A|B), considering not only the landslide-triggering rainfalls (i.e. the conditional probability of a certain rainfall event given the occurrence of a landslide, P(B|A)), but also the non-triggering rainfalls (i.e. the prior probability of a rainfall event, P(A)). The Bayesian approach was applied to the time interval between 1939 and 2009. The probability isolines obtained minimize false alarms and can easily be implemented in a regional warning system, but they may have limited forecasting power for phenomena not represented in the historical dataset or occurring under anomalous conditions. Examples are shallow landslides evolving into debris flows, extremely rare over the last 70 years but recently increasing in frequency.
We tried to address this problem by testing the predictive variability of several physically based models specifically developed for this purpose, including X-SLIP (Montrasio et al., 1998), SHALSTAB (SHALlow STABility model, Montgomery & Dietrich, 1994), Iverson (2000), TRIGRS 1.0 (Baum et al., 2002) and TRIGRS 2.0 (Baum et al., 2008).
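The Bayesian computation described above reduces to Bayes' theorem with probabilities replaced by relative frequencies counted over the historical record. A minimal sketch (the counts in the usage example are made-up illustrative numbers, not thesis data):

```python
def p_landslide_given_rain(n_total, n_rain, n_slide, n_rain_and_slide):
    """Empirical Bayes estimate of the landslide probability given a
    rainfall class: P(A|B) = P(B|A) * P(A) / P(B), where A = landslide
    occurrence and B = a given class of rainfall event, with probabilities
    estimated as relative frequencies over the historical catalogue.

      n_total          - rainfall events in the record
      n_rain           - events falling in class B            -> P(B)
      n_slide          - events that triggered a landslide    -> P(A)
      n_rain_and_slide - class-B events that triggered one    -> P(B|A)
    """
    p_b_given_a = n_rain_and_slide / n_slide   # P(B|A): triggering rainfalls
    p_a = n_slide / n_total                    # P(A): prior landslide probability
    p_b = n_rain / n_total                     # P(B): prior rainfall probability
    return p_b_given_a * p_a / p_b
```

As a sanity check, the result necessarily equals the direct estimate n_rain_and_slide / n_rain; the value of the Bayesian decomposition is that P(B|A), P(A) and P(B) can be mapped and contoured separately over intensity-duration classes.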
Abstract:
The monograph proposes an analysis of the period ca. 478-461 BC of Athenian history and of the career of Cimon, son of Miltiades, within its contemporary context. The study of Athens, and more generally of the various Greek communities facing the Aegean, in the years immediately after the Persian wars is organized in two parts: the first reviews in chronological order the available, essentially literary, evidence on the political and military activities of Athens as leader of the Greek alliance; the second draws broader conclusions, grounded in the preceding analysis, and seeks a synthesis of the period and of the man that moves beyond literary stereotypes and biases. From this perspective, starting from Thucydides' sparse treatment, the work reflects on the mechanisms through which the tradition distorted and sedimented the available information, generating a progressive enrichment that led, in effect, to the definition of a 'Cimonian era' whose essential features can be called into question. The aim is thus to propose an assessment of the period free of certain elements, ultimately most evident in Plutarch's approach to the material, that appear alien to the context of the first half of the fifth century: the main themes addressed are Athenian imperialism, philo-Laconism, the bipolarity between democracy and oligarchy, and politico-mythological propaganda. Athens' relationship with Sparta, with the Ionians and with the other communities of the Aegean world is read in light of the available evidence on the climate of uncertainty and delicate equilibria that arose in the aftermath of the withdrawal of the Persian forces.
The portrait of Cimon proposed here is that of a figure undoubtedly significant in contemporary politics but, at the same time, strongly conditioned and at times overshadowed by dynamics in some way shared across the Athenian political scene, shaped by the satisfaction of needs and ambitions that the tradition would later render archetypal of the democratic paradigm.
Abstract:
The energy released during a seismic crisis in volcanic areas is strictly related to the physical processes within the volcanic structure. In particular, Long Period (LP) seismicity, which seems to be related to the oscillation of a fluid-filled crack (Chouet, 1996; Chouet, 2003; McNutt, 2005), can precede or accompany an eruption. The present doctoral thesis is focused on the study of the LP seismicity recorded at the Campi Flegrei volcano (Campania, Italy) during the October 2006 crisis. Campi Flegrei is an active caldera; the combination of an active magmatic system and a densely populated area makes Campi Flegrei a critical volcano. The source dynamics of LP seismicity are thought to be very different from those of other kinds of seismicity (tectonic or volcano-tectonic): LP events are characterized by a time-sustained source and a low frequency content. These features imply that the duration magnitude, which is commonly used for VT events and sometimes for LPs as well, is unsuited for evaluating LP magnitude. The main goal of this doctoral work was to develop a method for determining the magnitude of LP seismicity, based on comparing the energy of VT and LP events and linking the energy to the VT moment magnitude. The magnitude of an LP event is thus defined as the moment magnitude of a VT event with the same energy as the LP. We applied this method to the LP dataset recorded at the Campi Flegrei caldera in 2006, to an LP dataset recorded at Colima volcano in 2005-2006, and to an event recorded at Etna volcano. By testing the method on many waveforms recorded at different volcanoes, we verified its ease of application and, consequently, its usefulness in the routine and quasi-real-time work of a volcanological observatory.
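A minimal sketch of the energy-equivalence idea: radiated energy is approximated by a calibrated integral of squared ground velocity, and the equivalent moment magnitude is read off an energy-magnitude relation. Both the energy proxy and the Gutenberg-Richter-type relation log10(E) = 1.5*Mw + 4.8 (E in joules) are standard assumptions, not necessarily the thesis' exact calibration:

```python
import numpy as np

def lp_equivalent_magnitude(velocity, fs, k=1.0):
    """Equivalent moment magnitude of an LP event from its radiated energy:
    the LP is assigned the moment magnitude of a VT event radiating the
    same energy. `velocity` is ground velocity samples, `fs` the sampling
    rate; k is a calibration constant lumping together density, wave speed
    and geometrical spreading (k=1 is a placeholder, not a real value).
    """
    # energy proxy: k * integral of squared ground velocity (J, up to k)
    energy = k * np.sum(np.asarray(velocity) ** 2) / fs
    # Gutenberg-Richter-type energy-magnitude relation, solved for Mw
    return (np.log10(energy) - 4.8) / 1.5
```

With a properly calibrated k, a VT and an LP waveform carrying the same energy are, by construction, assigned the same magnitude, which is exactly the property the method exploits.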