12 results for Home monitoring system
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
In recent years, composite materials have revolutionized the design of many structures. Their superior mechanical properties and light weight make composites advantageous over traditional metal structures for many applications. However, composite materials are susceptible to damage behaviors that are complex and challenging to predict because of their anisotropic nature. Structural Health Monitoring (SHM) can therefore be a valuable tool to assess the damage and understand the underlying physics. Distributed Optical Fiber Sensors (DOFS) can be used to monitor several types of damage in composites, yet their implementation outside academia is still unsatisfactory. One of the hindrances is the lack of a rigorous methodology for uncertainty quantification, which is essential for assessing the performance of the monitoring system. The concept of Probability of Detection (POD) should serve as the guiding principle in this process. However, precautions must be taken, since this tool was established for Non-Destructive Evaluation (NDE) rather than SHM. In addition, although DOFS have been the object of numerous studies, a well-established POD methodology for their performance assessment is still missing. This thesis aims to develop a methodology to produce POD curves for DOFS in composite materials. The problem is analyzed by considering several critical points, such as the strain transfer characterizing the DOFS and the development of an experimental and model-assisted methodology to understand the parameters that affect DOFS performance.
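As a point of reference for how a POD curve can be estimated from detection trials, the sketch below fits a classical hit/miss POD model (logistic in the logarithm of the defect size) to synthetic data; the defect sizes, detector behaviour and 90 % threshold are illustrative assumptions, not results or parameters from the thesis.

```python
# Hedged sketch: hit/miss POD curve from synthetic detection trials.
# Flaw sizes, the "true" detector and the threshold are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic trials: larger defects are detected more often.
sizes_mm = rng.uniform(0.5, 10.0, 400)                        # defect size a
p_true = 1.0 / (1.0 + np.exp(-(np.log(sizes_mm) - np.log(3.0)) / 0.25))
hits = rng.random(400) < p_true                               # 1 = detected, 0 = missed

# Classical hit/miss POD: logistic model in log(a).
model = LogisticRegression().fit(np.log(sizes_mm).reshape(-1, 1), hits)

a_grid = np.linspace(0.5, 10.0, 200)
pod = model.predict_proba(np.log(a_grid).reshape(-1, 1))[:, 1]

# a90: smallest size with POD >= 90 % (confidence bounds omitted in this sketch).
a90 = a_grid[np.argmax(pod >= 0.90)]
print(f"a90 ~ {a90:.2f} mm")
```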
Abstract:
Since the first underground nuclear explosion, carried out in 1958, the analysis of seismic signals generated by these sources has allowed seismologists to refine the travel times of seismic waves through the Earth and to verify the accuracy of location algorithms (the ground truth for these sources was often known). Long international negotiations have been devoted to limiting the proliferation and testing of nuclear weapons. In particular, the Comprehensive Nuclear-Test-Ban Treaty (CTBT) was opened for signature in 1996; although it has been signed by 178 States, it has not yet entered into force. The Treaty underlines the fundamental role of seismological observations in verifying its compliance, by detecting and locating seismic events and identifying the nature of their sources. A precise definition of the hypocentral parameters represents the first step in discriminating whether a given seismic event is natural or not. In case a specific event is deemed suspicious by the majority of the States Parties, the Treaty contains provisions for conducting an on-site inspection (OSI) in the area surrounding the epicenter of the event, located through the International Monitoring System (IMS) of the CTBT Organization. An OSI is supposed to include the use of passive seismic techniques in the area of the suspected clandestine underground nuclear test. In fact, high-quality seismological systems are thought to be capable of detecting and locating very weak aftershocks triggered by underground nuclear explosions in the first days or weeks following the test. This PhD thesis deals with the development of two different seismic location techniques. The first, known as the double difference joint hypocenter determination (DDJHD) technique, is aimed at locating closely spaced events at a global scale. The locations obtained by this method are characterized by high relative accuracy, although the absolute location of the whole cluster remains uncertain; we eliminate this problem by introducing a priori information, namely the known location of a selected event. The second technique concerns reliable estimation of the back azimuth and apparent velocity of seismic waves from local events of very low magnitude recorded by a tripartite array at a very local scale. For both techniques, we have used cross-correlation among digital waveforms in order to minimize the errors linked to incorrect phase picking. The cross-correlation method relies on the similarity between waveforms of a pair of events at the same station, at the global scale, and on the similarity between waveforms of the same event at two different sensors of the tripartite array, at the local scale. After preliminary tests on the reliability of our location techniques based on simulations, we have applied both methodologies to real seismic events. The DDJHD technique has been applied to a seismic sequence that occurred in the Turkey-Iran border region, using data recorded by the IMS. Initially, the algorithm was applied to the differences among the original arrival times of the P phases, without cross-correlation. We found that the considerable geometrical spreading noticeable in the standard locations (namely the locations produced by the analysts of the International Data Center (IDC) of the CTBT Organization, assumed as our reference) was substantially reduced by the application of our technique.
This is what we expected, since the methodology was applied to a sequence of events for which a real closeness among the hypocenters can be assumed, as they belong to the same seismic structure. Our results point out the main advantage of this methodology: the systematic errors affecting the arrival times have been removed or at least reduced. The introduction of cross-correlation did not bring evident improvements to our results: the two sets of locations (with and without the application of the cross-correlation technique) are very similar to each other. This suggests that cross-correlation did not substantially improve the precision of the manual pickings; the picks reported by the IDC are probably good enough to make the random picking error less important than the systematic error on travel times. As a further explanation for the limited benefit of cross-correlation, it should be noted that the events in our data set generally do not have a good signal-to-noise ratio (SNR): the selected sequence is composed of weak events (magnitude 4 or smaller) and the signals are strongly attenuated because of the large distance between the stations and the hypocentral area. At the local scale, in addition to cross-correlation, we performed a signal interpolation in order to improve the time resolution. The algorithm so developed was applied to data collected during an experiment carried out in Israel between 1998 and 1999. The results point out the following relevant conclusions: a) it is necessary to correlate waveform segments corresponding to the same seismic phases; b) it is not essential to select the exact first arrivals; and c) relevant information can also be obtained from the maximum-amplitude wavelet of the waveforms (particularly in poor SNR conditions). Another notable feature of our procedure is that it does not require long processing times, so the user can immediately check the results. During a field survey, this feature makes a quasi real-time check possible, allowing the immediate optimization of the array geometry if suggested by the results at an early stage.
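For illustration, the sketch below reproduces the two signal-processing ingredients mentioned above under simplified assumptions: a cross-correlation delay estimate refined by parabolic sub-sample interpolation, and a plane-wave least-squares fit that turns the delays measured at a tripartite array into back azimuth and apparent velocity. The station geometry, sampling and function names are invented for the example and do not come from the thesis.

```python
# Hedged sketch: (a) inter-sensor delay via cross-correlation with parabolic
# sub-sample interpolation; (b) back azimuth / apparent velocity from the
# delays at a tripartite array under a plane-wave assumption.
import numpy as np

def xcorr_delay(x, y, dt):
    """Delay of y relative to x (seconds), refined to sub-sample precision."""
    c = np.correlate(y - y.mean(), x - x.mean(), mode="full")
    k = int(np.argmax(c))
    # Parabolic interpolation around the correlation peak.
    if 0 < k < len(c) - 1:
        denom = c[k - 1] - 2.0 * c[k] + c[k + 1]
        shift = 0.5 * (c[k - 1] - c[k + 1]) / denom if denom != 0 else 0.0
    else:
        shift = 0.0
    return (k - (len(x) - 1) + shift) * dt

def plane_wave_fit(coords_m, delays_s):
    """Least-squares horizontal slowness from delays relative to station 0."""
    d = coords_m[1:] - coords_m[0]                    # offsets (east, north), metres
    s, *_ = np.linalg.lstsq(d, delays_s, rcond=None)  # slowness vector (s/m)
    back_azimuth = np.degrees(np.arctan2(-s[0], -s[1])) % 360.0
    v_app = 1.0 / np.linalg.norm(s)                   # apparent velocity (m/s)
    return back_azimuth, v_app

# Toy usage: a synthetic pulse recorded at three stations 100 m apart.
dt = 0.005
t = np.arange(0, 2, dt)
pulse = np.exp(-((t - 0.8) / 0.02) ** 2)
coords = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0]])  # (east, north), metres
true_s = np.array([1.0 / 4000.0, 0.0])                       # wave travelling east at 4 km/s
delays = [xcorr_delay(pulse, np.interp(t - (coords[i] - coords[0]) @ true_s, t, pulse), dt)
          for i in (1, 2)]
print(plane_wave_fit(coords, np.array(delays)))              # ~ (270 deg, 4000 m/s)
```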
Abstract:
Salt deposits characterize the subsurface of Tuzla (BiH) and have made it famous since ancient times. Archeological discoveries demonstrate the presence of a Neolithic pile-dwelling settlement related to the existence of saltwater springs that turned most of the area into swampy ground. Since Roman times, the town has been referred to as "the City of Salt Deposits and Springs"; "tuz" is the Turkish word for salt, as the Ottomans renamed the settlement in the 15th century following their conquest of medieval Bosnia (Donia and Fine, 1994). Natural brine springs were located everywhere, and salt has been evaporated over hot charcoals since pre-Roman times. This ancient use of salt was a small-scale exploitation compared to the massive salt production carried out during the 20th century by means of classical mining methods and especially wild brine pumping. In the past, salt extraction was practised by tapping natural brine springs, while the modern technique consists of about 100 boreholes with pumps tapping the natural underground brine runs at an average depth of 400-500 m. The mining operations changed the hydrogeological conditions, enabling the downward flow of fresh water and causing additional salt dissolution. This process induced severe ground subsidence over the last 60 years, reaching up to 10 meters of sinking in the most affected area. Stress and strain of the overlying rocks induced the formation of numerous fractures over a considerable area (3 km²). Consequently, serious damage occurred to buildings and to infrastructure such as the water supply system, sewage networks and power lines. Downtown urban life was compromised by the destruction of more than 2000 buildings that collapsed or had to be demolished, causing the resettlement of about 15000 inhabitants (Tatić, 1979). Recently, salt extraction activities have been strongly reduced, but the underground water system is returning to its natural conditions, threatening to flood the most subsided area. Over the last 60 years the local government developed a monitoring system for the phenomenon, collecting data on geodetic measurements, the amount of brine pumped, piezometry, lithostratigraphy, the extension of the salt body and geotechnical parameters. A database was created within a scientific cooperation between the municipality of Tuzla and the city of Rotterdam (D.O.O. Mining Institute Tuzla, 2000). The scientific investigation presented in this dissertation has been financially supported by a cooperation project between the Municipality of Tuzla, the University of Bologna (CIRSA) and the Province of Ravenna. The University of Tuzla (RGGF) gave important scientific support, in particular on the geological and hydrogeological features. Subsidence damage resulting from evaporite dissolution generates substantial losses throughout the world, but the causes are only well understood in a few areas (Gutierrez et al., 2008). The subject of this study is the collapse phenomenon occurring in the Tuzla area, with the aim of identifying and quantifying the several factors involved in the system and their correlations. The Tuzla subsidence phenomenon can be defined as a geohazard, which represents the consequence of an adverse combination of geological processes and ground conditions precipitated by human activity with the potential to cause harm (Rosenbaum and Culshaw, 2003). Where a hazard induces a risk to a vulnerable element, a risk management process is required.
The single factors involved in the subsidence of Tuzla can be considered as hazards. The final objective of this dissertation is a preliminary risk assessment procedure and guidelines, developed in order to quantify building vulnerability in relation to the overall geohazard affecting the town. The available historical database, never fully processed before, has been analyzed by means of geographic information systems and mathematical interpolators (PART I). Modern geomatic applications have been implemented to investigate the most relevant hazards in depth (PART II). In order to monitor and quantify the current subsidence rates, geodetic GPS technologies have been employed and four survey campaigns have been carried out, one per year. The subsidence-related fracture system has been identified by means of field surveys and a mathematical interpretation of the sinking surface, called curvature analysis. The comparison of mapped and predicted fractures led to a better understanding of the problem. The results confirmed the reliability of fracture identification using curvature analysis applied to sinking data instead of topographic or seismic data. The evolution of urban changes has been reconstructed by analyzing topographic maps and satellite imagery, identifying the most damaged areas. This part of the investigation was very important for the quantification of building vulnerability.
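As an illustration of what a curvature analysis of a sinking surface involves, the sketch below computes Gaussian and mean curvature on a synthetic gridded subsidence surface and flags the highest-curvature cells as candidate fracture zones; the grid, cell size and percentile threshold are assumptions made for the example, not the thesis data or workflow.

```python
# Hedged sketch of a curvature analysis on a gridded subsidence surface.
import numpy as np

def curvature_map(z, cell=10.0):
    """Gaussian and mean curvature of a surface z sampled on a regular grid."""
    zy, zx = np.gradient(z, cell)          # first derivatives (axis 0 = y, axis 1 = x)
    zxy, zxx = np.gradient(zx, cell)       # d2z/dydx, d2z/dx2
    zyy, _ = np.gradient(zy, cell)         # d2z/dy2
    gaussian = (zxx * zyy - zxy ** 2) / (1.0 + zx ** 2 + zy ** 2) ** 2
    mean = ((1 + zx ** 2) * zyy - 2 * zx * zy * zxy + (1 + zy ** 2) * zxx) \
           / (2.0 * (1.0 + zx ** 2 + zy ** 2) ** 1.5)
    return gaussian, mean

# Synthetic bowl-shaped subsidence (up to ~10 m) just to exercise the function.
x, y = np.meshgrid(np.linspace(-500, 500, 101), np.linspace(-500, 500, 101))
z = -10.0 * np.exp(-(x ** 2 + y ** 2) / (2 * 200.0 ** 2))

gauss, mean = curvature_map(z, cell=10.0)
candidate_fractures = np.abs(mean) > np.percentile(np.abs(mean), 95)
print("flagged cells:", int(candidate_fractures.sum()))
```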
Abstract:
Life is full of uncertainties. Legal rules should have a clear intention, motivation and purpose in order to diminish daily uncertainties. However, practice shows that their consequences are complex and hard to predict. For instance, tort law has the general objectives of deterring future negligent behavior and compensating the victims of someone else's negligence. Achieving these goals is particularly difficult in medical malpractice cases. To start with, when patients seek medical care they are typically sick in the first place. If harm materializes during treatment, it might be very hard to assess whether it was due to substandard medical care or to the patient's poor health conditions. Moreover, the practice of medicine has a positive externality on society, which means that the design of legal rules is crucial: for instance, it should not result in physicians avoiding practice simply because they are afraid of being sued even when they acted according to the standard level of care. The empirical literature on medical malpractice has developed substantially in the past two decades, with the American case being the most studied one. Evidence from civil law tradition countries is more difficult to find. The aim of this thesis is to contribute to the empirical literature on medical malpractice, using two civil law countries as case studies: Spain and Italy. The first goal of this thesis is to investigate some of the consequences of having two separate sub-systems (administrative and civil) coexisting within the same legal system, which is common in civil law tradition countries with a public national health system (such as Spain, France and Portugal). When this holds, different procedures may apply depending on the type of hospital where the injury took place (essentially, whether it is a public or a private hospital). Therefore, a patient injured in a public hospital must file a claim in administrative courts, while a patient suffering an identical medical accident in a private hospital must file a claim in civil courts. A natural question the reader might pose is why both administrative and civil courts should decide medical malpractice cases, and whether this specialization of courts can influence how judges decide such cases. In the past few years there has been a general concern with patient safety, which is currently on the agenda of several national governments, and some initiatives have been taken at the international level with the aim of preventing harm to patients during treatment and care. A negligently injured patient may present a claim against the health care provider with the aim of being compensated for the economic loss and for pain and suffering. In several European countries, health care is mainly provided by a public national health system, which means that if a patient harmed in a public hospital succeeds in a claim against the hospital, public expenditure increases because the State takes part in the litigation process. This poses a problem in a context of increasing national health expenditure and public debt. In Italy, with the aim of increasing patient safety, some regions implemented a monitoring system for medical malpractice claims. If properly implemented, this reform should also allow for a reduction in medical malpractice insurance costs. This thesis is organized as follows.
Chapter 1 provides a review of the empirical literature on medical malpractice, covering studies on the outcomes and merit of claims, costs, and defensive medicine. Chapter 2 presents an empirical analysis of medical malpractice claims reaching the Spanish Supreme Court, focusing on reversal rates for civil and administrative decisions. Administrative decisions appealed by the plaintiff have the highest reversal rates. The results show a bias in lower administrative courts, which tend to side with the State; we provide a detailed explanation for these results, which may lie in the organization of administrative judges' careers. Chapter 3 assesses predictors of compensation in medical malpractice cases appealed to the Spanish Supreme Court and investigates the amount of damages awarded to patients. The results show horizontal equity between administrative and civil decisions (controlling for observable case characteristics) and vertical inequity (patients suffering more severe injuries tend to receive higher payouts). To carry out these analyses, a database of medical malpractice decisions appealed to the Administrative and Civil Chambers of the Spanish Supreme Court from 2006 until 2009, designated the Spanish Supreme Court Medical Malpractice Dataset (SSCMMD), has been created. A description of how the SSCMMD was built and of the Spanish legal system is presented as well. Chapter 4 includes an empirical investigation of the effect of a monitoring system for medical malpractice claims on insurance premiums. In Italy, some regions adopted this policy in different years, while others did not. The study uses data on insurance premiums from Italian public hospitals for the years 2001-2008; this is a significant difference with respect to most studies, which use the insurance company as the unit of analysis. Although insurance premiums rose from 2001 to 2008, the increase was lower for regions adopting a monitoring system for medical claims. Possible implications of this system are also discussed. Finally, Chapter 5 discusses the main findings, describes possible future research and concludes.
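As a rough illustration of the kind of panel specification suggested by Chapter 4, the sketch below estimates a two-way fixed-effects (difference-in-differences) regression of log insurance premiums on an indicator for the region having adopted the claims-monitoring system, on a synthetic hospital-year panel; all variable names, the adoption pattern and the data are invented and are not the thesis dataset or its exact model.

```python
# Hedged sketch: staggered-adoption difference-in-differences on a synthetic panel.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
rows = []
for h in range(40):                                   # hospitals
    region = h % 8
    adopted_in = 2004 + region % 3 if region < 4 else None   # staggered adoption
    for year in range(2001, 2009):
        treated = int(adopted_in is not None and year >= adopted_in)
        premium = 100 + 5 * (year - 2001) - 8 * treated + rng.normal(0, 3)
        rows.append(dict(hospital=h, region=region, year=year,
                         monitoring=treated, log_premium=np.log(premium)))
panel = pd.DataFrame(rows)

# Hospital and year fixed effects absorb level differences and common trends;
# standard errors are clustered at the region level.
fit = smf.ols("log_premium ~ monitoring + C(hospital) + C(year)", data=panel).fit(
    cov_type="cluster", cov_kwds={"groups": panel["region"]})
print(fit.params["monitoring"])
```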
Abstract:
Antibiotic resistance is a public health problem whose management requires a surveillance system based on the collection and analysis of laboratory epidemiological data. The PhD project consisted in developing a web application, usable at hospital level, for managing antibiotic susceptibility data of clinical isolates. A web platform associated with a relational database was created to obtain a dynamic application that can be updated easily by inserting new data, without manually editing the HTML pages that compose the application itself. The open-source MySQL database was used because it offers numerous advantages: it is extremely stable, delivers high performance, is supported by a large online community, and is free. The dynamic content of the web application must be generated by a scripting language that automates the insertion, modification, deletion and display of large amounts of data. PHP was chosen, an open-source language developed specifically for building dynamic web pages and fully compatible with the MySQL database. The database architecture was defined by creating the tables containing the data and the relations between them: registry (patient) records, data on samples, the isolated microorganisms, and the antibiograms with the interpretive categories for each antibiotic. Once tables and relations were defined, the code for the main functions was written: manual entry of antibiograms, import of multiple antibiograms from files exported by automated instruments, modification/deletion of antibiograms previously entered into the system, and analysis of the data in the database, with trends in the prevalence of microbial species and in their drug resistance, accompanied by charts. Development included continuous testing of the functions as they were implemented, using real clinical data; dedicated input checks and a simple, clean graphical layout were also introduced.
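To make the relational structure described above concrete, the sketch below builds a miniature samples-isolates-antibiograms schema and a resistance-rate query; the thesis application is based on MySQL and PHP, whereas this example uses Python's sqlite3 only to stay self-contained, and all table and column names are invented.

```python
# Hedged sketch of a samples / isolates / antibiograms schema and a trend query.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE sample   (id INTEGER PRIMARY KEY, patient_id TEXT, ward TEXT, collected_on TEXT);
CREATE TABLE isolate  (id INTEGER PRIMARY KEY, sample_id INTEGER REFERENCES sample(id),
                       species TEXT);
CREATE TABLE antibiogram (id INTEGER PRIMARY KEY, isolate_id INTEGER REFERENCES isolate(id),
                          antibiotic TEXT, category TEXT CHECK (category IN ('S','I','R')));
""")

db.execute("INSERT INTO sample VALUES (1, 'P001', 'ICU', '2024-01-10')")
db.execute("INSERT INTO isolate VALUES (1, 1, 'Escherichia coli')")
db.execute("INSERT INTO antibiogram VALUES (1, 1, 'ciprofloxacin', 'R')")

# Resistance rate per species/antibiotic: the kind of trend a web UI would chart.
query = """
SELECT i.species, a.antibiotic,
       100.0 * SUM(a.category = 'R') / COUNT(*) AS pct_resistant
FROM antibiogram a JOIN isolate i ON i.id = a.isolate_id
GROUP BY i.species, a.antibiotic;
"""
for row in db.execute(query):
    print(row)
```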
Abstract:
The supply of mineral resources and the protection of the environment are often regarded as conflicting and irreconcilable activities, yet both are essential needs of modern societies. Since georesources are non-renewable, they must be exploited efficiently, using tools that guarantee the environmental, social and economic sustainability of extractive operations. The need to protect the territory and improve the quality of life of local communities requires public administrations to implement measures for the rehabilitation of degraded areas; however, until the early 1990s sector legislation provided no instruments for this purpose, which led to the proliferation of disused and abandoned extraction sites with no environmental restoration. This research provides innovative contributions to the sustainable planning and design of extractive activities, through a multidisciplinary approach and the expert use of Geographic Information Systems, in particular GRASS GIS. Following an in-depth analysis of the tools and procedures adopted in the planning of extractive activities in Italy, an investigation method and an expert system were developed for predicting and controlling the ground vibrations induced by blasting in open-pit quarries, which allow the design of the blast and of the vibration monitoring network to be optimized thanks to specific operational tools implemented in GRASS GIS. To support a more effective programming of territorial rehabilitation measures, a procedure was developed for selecting disused sites and potential rehabilitation interventions, which optimizes planning activities by identifying interventions characterized by high environmental, economic and social sustainability. The results demonstrate the need for an expert approach to the planning and design of extractive activities, increasing their sustainability through the adoption of more efficient operational tools.
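As an indication of how blast-induced ground vibration is commonly predicted, the sketch below fits a scaled-distance attenuation law, PPV = K(D/√Q)^(−β), to a handful of invented monitoring records and uses it for a prediction; the thesis implements its own prediction and monitoring tools within GRASS GIS, so this is only a generic illustration.

```python
# Hedged sketch: scaled-distance attenuation law fitted in log-log space.
import numpy as np

# (distance to blast [m], charge per delay [kg], measured PPV [mm/s]) - invented data
records = np.array([
    (120.0,  50.0, 9.2),
    (200.0,  50.0, 4.1),
    (320.0,  80.0, 2.6),
    (450.0,  80.0, 1.4),
    (600.0, 120.0, 1.0),
])
D, Q, ppv = records.T
scaled_distance = D / np.sqrt(Q)

# Linear regression: log PPV = log K - beta * log SD
A = np.column_stack([np.ones_like(scaled_distance), np.log(scaled_distance)])
(logK, neg_beta), *_ = np.linalg.lstsq(A, np.log(ppv), rcond=None)
K, beta = np.exp(logK), -neg_beta

# Predicted PPV at a receptor 500 m away for a 100 kg charge per delay.
print(K * (500.0 / np.sqrt(100.0)) ** (-beta))
```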
Abstract:
In the context of increasing beam energy and luminosity of the LHC accelerator at CERN, it will be important to accurately measure the Machine Induced Background. A new monitoring system will be installed in the CMS cavern for measuring the beam background at high radius. This detector, called the Beam Halo Monitor, will provide an online, bunch-by-bunch measurement of background induced by beam halo interactions, separately for each beam. The detector is composed of synthetic quartz Cherenkov radiators, coupled to fast UV sensitive photomultiplier tubes. The directional and fast response of the system allows the discrimination of the background particles from the dominant flux in the cavern induced by pp collision debris, produced within the 25 ns bunch spacing. The readout electronics of this detector will make use of many components developed for the upgrade of the CMS Hadron Calorimeter electronics, with a dedicated firmware and readout adapted to the beam monitoring requirements. The PMT signal will be digitized by a charge integrating ASIC, providing both the signal rise time and the charge integrated over one bunch crossing. The backend electronics will record bunch-by-bunch histograms, which will be published to CMS and the LHC using the newly designed CMS beam instrumentation specific DAQ. A calibration and monitoring system has been designed to generate triggered pulses of UV light to monitor the efficiency of the system. The experimental results validating the design of the detector, the calibration system and the electronics will be presented.
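As a simplified picture of the bunch-by-bunch bookkeeping performed by the backend electronics, the sketch below accumulates per-bunch histograms of above-threshold hits over successive orbits; the 3564 bunch buckets per LHC orbit are real, but the threshold, readings and data layout are invented for the example.

```python
# Hedged sketch: per-bunch accumulation of above-threshold hits over many orbits.
import numpy as np

N_BUNCHES = 3564                 # bunch buckets per LHC orbit
THRESHOLD_FC = 20.0              # illustrative charge threshold (fC), not a real setting

hit_counts = np.zeros(N_BUNCHES, dtype=np.int64)

def accumulate(readings):
    """readings: iterable of (bunch_id, integrated_charge_fC) for one orbit."""
    for bunch_id, charge in readings:
        if charge > THRESHOLD_FC:
            hit_counts[bunch_id] += 1

# One fake orbit worth of readings.
rng = np.random.default_rng(2)
accumulate((b, rng.exponential(10.0)) for b in range(N_BUNCHES))
print("occupied bunches this orbit:", int((hit_counts > 0).sum()))
```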
Abstract:
Diabetes mellitus (DM) is one of the most common endocrine diseases in dogs. Once a diagnosis of DM has been reached, insulin treatment and a specific diet must be started in order to control blood glucose levels and, consequently, the clinical signs. Furthermore, close therapeutic monitoring is essential to achieve good glycaemic control. This thesis reports several studies on the treatment, monitoring and prognosis of dogs with DM. Chapter 2 is a review illustrating the main therapeutic and monitoring aspects of DM. Chapter 3 reports a study comparing the efficacy and safety of Lente insulin and Neutral Protamine Hagedorn (NPH) insulin. Monitoring methods for dogs with DM can be classified as direct or indirect. Direct methods include serial blood glucose measurements or continuous interstitial glucose monitoring with dedicated devices (Continuous Glucose Monitoring Systems, CGMS). Indirect methods include the assessment of water intake and body weight, the quantification of glucose/ketones in urine, and the measurement of glycated protein concentrations. Chapter 4 presents a study evaluating the accuracy and precision of a glucometer and a glucometer/ketometer in dogs. The Flash Glucose Monitoring System is a CGMS recently validated for use in dogs; its clinical usefulness in monitoring canine DM is examined in Chapter 5. Chapter 6 describes a study validating two analytical methods for measuring serum fructosamine and glycated haemoglobin in dogs and comparing the usefulness of the two glycated proteins in defining glycaemic control. Finally, Chapter 7 reports a study aimed at determining survival time and identifying the prognostic value of several clinical and clinicopathological variables in dogs with DM.
Abstract:
In recent years, IoT technology has radically transformed many crucial industrial and service sectors, such as healthcare. The multi-faceted heterogeneity of the devices and of the collected information provides important opportunities to develop innovative systems and services. However, the ubiquitous presence of data silos and the poor semantic interoperability in the IoT landscape constitute a significant obstacle in the pursuit of this goal. Moreover, achieving actionable knowledge from the collected data requires IoT information sources to be analysed using appropriate artificial intelligence techniques, such as automated reasoning. In this thesis work, Semantic Web technologies have been investigated as an approach to address both the data integration and the reasoning aspects of modern IoT systems. In particular, the contributions presented in this thesis are the following: (1) the IoT Fitness Ontology, an OWL ontology developed to overcome the issue of data silos and enable semantic interoperability in the IoT fitness domain; (2) a Linked Open Data web portal for collecting and sharing IoT health datasets with the research community; (3) a novel methodology for embedding knowledge in rule-defined IoT smart home scenarios; and (4) a knowledge-based IoT home automation system that supports a seamless integration of heterogeneous devices and data sources.
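To give a flavour of the Semantic Web approach described above, the sketch below lifts two heterogeneous fitness readings into RDF under a shared (invented) vocabulary and queries them uniformly with SPARQL using rdflib; the namespace and class/property names are stand-ins, not terms from the actual IoT Fitness Ontology.

```python
# Hedged sketch: heterogeneous IoT readings integrated as RDF and queried via SPARQL.
from rdflib import Graph, Literal, Namespace, RDF
from rdflib.namespace import XSD

IOT = Namespace("http://example.org/iot-fitness#")   # invented namespace
g = Graph()
g.bind("iot", IOT)

# Two observations coming from different devices, integrated under one model.
for i, (device, bpm) in enumerate([("wristband-A", 72), ("chest-strap-B", 75)]):
    obs = IOT[f"obs{i}"]
    g.add((obs, RDF.type, IOT.HeartRateObservation))
    g.add((obs, IOT.observedBy, Literal(device)))
    g.add((obs, IOT.valueBPM, Literal(bpm, datatype=XSD.integer)))

# A single SPARQL query now spans both sources.
q = """
PREFIX iot: <http://example.org/iot-fitness#>
SELECT ?device ?bpm WHERE {
  ?o a iot:HeartRateObservation ; iot:observedBy ?device ; iot:valueBPM ?bpm .
  FILTER (?bpm > 70)
}
"""
for device, bpm in g.query(q):
    print(device, bpm)
```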
Abstract:
The enhanced production of strange hadrons in heavy-ion collisions relative to that in minimum-bias pp collisions is historically considered one of the first signatures of the formation of a deconfined quark-gluon plasma. At the LHC, the ALICE experiment observed that the ratio of strange to non-strange hadron yields increases with the charged-particle multiplicity at midrapidity, starting from pp collisions and evolving smoothly across interaction systems and energies, ultimately reaching Pb-Pb collisions. The understanding of the origin of this effect in small systems remains an open question. This thesis presents a comprehensive study of the production of $K^{0}_{S}$, $\Lambda$ ($\bar{\Lambda}$) and $\Xi^{-}$ ($\bar{\Xi}^{+}$) strange hadrons in pp collisions at $\sqrt{s}$ = 13 TeV collected in LHC Run 2 with ALICE. A novel approach is exploited, introducing, for the first time, the concept of effective energy in the study of strangeness production in hadronic collisions at the LHC. In this work, the ALICE Zero Degree Calorimeters are used to measure the energy carried by forward-emitted baryons in pp collisions, which reduces the effective energy available for particle production with respect to the nominal centre-of-mass energy. The results presented in this thesis provide new insights into the interplay, for strangeness production, between the initial stages of the collision and the produced final hadronic state. Finally, the first Run 3 results on the production of $\Omega^{-}$ ($\bar{\Omega}^{+}$) multi-strange baryons are presented, measured in pp collisions at $\sqrt{s}$ = 13.6 TeV and 900 GeV, the highest and lowest collision energies reached so far at the LHC. This thesis also presents the development and validation of the ALICE Time-Of-Flight (TOF) data quality monitoring system for LHC Run 3. This work was fundamental to assess the performance of the TOF detector during the commissioning phase, in the Long Shutdown 2, and during the data taking period.
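For orientation, one common way to express the effective energy in this kind of analysis (the precise definition adopted in the thesis may differ) subtracts the energy carried by the leading forward baryons, as measured by the two Zero Degree Calorimeters on the A and C sides, from the nominal centre-of-mass energy:

$E_{\mathrm{eff}} \simeq \sqrt{s} - \left(E_{\mathrm{ZDC}}^{A} + E_{\mathrm{ZDC}}^{C}\right)$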
Abstract:
Over the last 60 years, computers and software have enabled incredible advancements in every field. Nowadays, however, these systems are so complex that it is difficult to understand whether they meet some requirement or are able to exhibit some desired behaviour or property. This dissertation introduces a Just-In-Time (JIT) a posteriori approach to conformance checking, which identifies any deviation from the desired behaviour as soon as possible and, where possible, applies corrections. The declarative framework that implements our approach – entirely developed on the promising open-source forward-chaining Production Rule System (PRS) named Drools – consists of three components: 1. a monitoring module based on a novel, efficient implementation of Event Calculus (EC); 2. a general-purpose hybrid reasoning module (the first of its kind) merging temporal, semantic, fuzzy and rule-based reasoning; 3. a logic formalism based on the concept of expectations, introducing Event-Condition-Expectation rules (ECE-rules) to assess the global conformance of a system. The framework is also accompanied by an optional module that provides Probabilistic Inductive Logic Programming (PILP). By shifting the conformance check from after execution to just in time, this approach combines the advantages of many a posteriori and a priori methods proposed in the literature. Quite remarkably, if the corrective actions are explicitly given, the reactive nature of this methodology allows any deviation from the desired behaviour to be reconciled as soon as it is detected. In conclusion, the proposed methodology brings advancements toward solving the problem of conformance checking, helping to fill the gap between humans and increasingly complex technology.
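As a toy illustration of the Event Calculus idea underlying the monitoring module, the sketch below represents fluents that are initiated and terminated by events and reconstructs whether a fluent holds at a given instant, together with a minimal expectation check in the spirit of ECE-rules; it is a Python re-implementation written for this example, not the Drools-based engine developed in the thesis, and the event and fluent names are invented.

```python
# Hedged sketch of the Event Calculus core: initiates/terminates axioms and holds_at.
from dataclasses import dataclass

@dataclass(frozen=True)
class Event:
    name: str
    time: float

# Domain axioms: which events initiate/terminate which fluents (invented domain).
INITIATES = {"switch_on": "light_on"}
TERMINATES = {"switch_off": "light_on"}

def holds_at(fluent, t, trace):
    """True if an earlier event initiated `fluent` and no later event terminated it."""
    state = False
    for ev in sorted(trace, key=lambda e: e.time):
        if ev.time >= t:
            break
        if INITIATES.get(ev.name) == fluent:
            state = True
        elif TERMINATES.get(ev.name) == fluent:
            state = False
    return state

trace = [Event("switch_on", 1.0), Event("switch_off", 5.0)]
print(holds_at("light_on", 3.0, trace))   # True
print(holds_at("light_on", 6.0, trace))   # False

# An expectation check in the same spirit: after switch_on we expect the light
# to be on at the sampled instants until it is switched off.
violations = [t for t in (2.0, 4.0) if not holds_at("light_on", t, trace)]
print("expectation violations:", violations)
```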