28 results for Mathematical and statistical techniques


Relevance: 100.00%

Abstract:

Long-term monitoring of acoustical environments is gaining popularity thanks to the wealth of scientific and engineering insight it provides. The growing interest is driven by the constant increase in storage capacity and computational power available to process large amounts of data. In this perspective, machine learning (ML) provides a broad family of data-driven statistical techniques for dealing with large databases. Nowadays, the conventional practice of sound level meter measurements limits the global description of a sound scene to an energetic point of view: the equivalent continuous level Leq is the main metric used to characterize an acoustic environment. Finer analyses involve the use of statistical levels; however, acoustic percentiles rely on temporal assumptions that are not always reliable. A statistical approach based on the study of the occurrences of sound pressure levels brings a different perspective to the analysis of long-term monitoring. Depicting a sound scene through the most probable sound pressure level, rather than through portions of energy, yields more specific information about the activity carried out during the measurements, and the statistical mode of the occurrences can capture typical behaviors of specific kinds of sound sources. The present work proposes an ML-based method to identify, separate and measure coexisting sound sources in real-world scenarios. It is based on long-term monitoring and is addressed to acousticians analysing environmental noise in a variety of contexts. The method relies on cluster analysis: two algorithms, the Gaussian Mixture Model and K-means clustering, form the core of a procedure to investigate different active spaces monitored with sound level meters. The procedure has been applied in two different contexts, university lecture halls and offices. The proposed method gives robust and reliable results in describing the acoustic scenario and could become an important analytical tool for acousticians.
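A minimal sketch of the clustering step described in this abstract, assuming the sound level meter record is available as an array of short-term sound pressure levels; the two-source synthetic data, the cluster count and the library calls below are illustrative, not the thesis implementation:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.mixture import GaussianMixture

# Synthetic stand-in for a long-term monitoring record: two coexisting "sources"
# (e.g. background ventilation around 45 dB and speech activity around 62 dB).
rng = np.random.default_rng(0)
spl = np.concatenate([rng.normal(45, 2, 5000), rng.normal(62, 4, 2000)]).reshape(-1, 1)

# Gaussian Mixture Model: each component models the occurrences of one source.
gmm = GaussianMixture(n_components=2, random_state=0).fit(spl)
order = np.argsort(gmm.means_.ravel())
print("GMM modes (dB):", gmm.means_.ravel()[order])
print("GMM weights   :", gmm.weights_[order])

# K-means on the same one-dimensional data gives a simpler partition of the levels.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(spl)
print("K-means centres (dB):", np.sort(km.cluster_centers_.ravel()))
```

The component means play the role of the "most probable sound pressure levels" of the coexisting sources, while the mixture weights estimate how often each source dominates the record.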

Relevance: 100.00%

Abstract:

Galaxy clusters occupy a special position in the cosmic hierarchy as they are the largest bound structures in the Universe. There is now general agreement on a hierarchical picture for the formation of cosmic structures, in which galaxy clusters form by accretion of matter and by merging between smaller units. During merger events, shocks are driven by the gravity of the dark matter into the diffuse baryonic component, which is heated up to the observed temperature. Radio and hard X-ray observations have discovered non-thermal components mixed with the thermal Intra Cluster Medium (ICM), and this is of great importance as it calls for a “revision” of the physics of the ICM. The bulk of the present information comes from radio observations, which have discovered an increasing number of Mpc-sized emissions from the ICM: Radio Halos (at the cluster centre) and Radio Relics (at the cluster periphery). These sources are due to synchrotron emission from ultra-relativistic electrons diffusing through µG turbulent magnetic fields. Radio Halos are the most spectacular evidence of non-thermal components in the ICM, and understanding the origin and evolution of these sources represents one of the most challenging goals of the theory of the ICM. Cluster mergers are the most energetic events in the Universe, and a fraction of the energy dissipated during these mergers could be channelled into the amplification of magnetic fields and into the acceleration of high-energy particles via the shocks and turbulence driven by these mergers. Present observations of Radio Halos (and possibly of hard X-rays) can be best interpreted in terms of the re-acceleration scenario, in which MHD turbulence injected during cluster mergers re-accelerates high-energy particles in the ICM. The physics involved in this scenario is very complex and model details are difficult to test; however, the model clearly predicts some simple properties of Radio Halos (and of the resulting IC emission in the hard X-ray band) which are almost independent of the details of the adopted physics. In particular, in the re-acceleration scenario MHD turbulence is injected and dissipated during cluster mergers, and thus Radio Halos (and also the resulting hard X-ray IC emission) should be transient phenomena (with a typical lifetime ≲ 1 Gyr) associated with dynamically disturbed clusters. The physics of the re-acceleration scenario should produce an unavoidable cut-off in the spectrum of the re-accelerated electrons, due to the balance between turbulent acceleration and radiative losses. The energy at which this cut-off occurs, and thus the maximum frequency at which synchrotron radiation is produced, depends essentially on the efficiency of the acceleration mechanism, so that observations at high frequencies are expected to catch only the most efficient phenomena, while low-frequency radio surveys may in principle find these phenomena to be much more common in the Universe. These basic properties should leave an important imprint on the statistical properties of Radio Halos (and of non-thermal phenomena in general), which, however, have not yet been addressed by current models. The main focus of this PhD thesis is to calculate, for the first time, the expected statistics of Radio Halos in the context of the re-acceleration scenario. In particular, we address the following main questions:
• Is it possible to model “self-consistently” the evolution of these sources together with that of the parent clusters?
• How is the occurrence of Radio Halos expected to change with cluster mass and to evolve with redshift? How does the efficiency of catching Radio Halos in galaxy clusters change with the observing radio frequency?
• How many Radio Halos are expected to form in the Universe? At which redshift is the bulk of these sources expected?
• Is it possible to reproduce, within the re-acceleration scenario, the observed occurrence and number of Radio Halos in the Universe and the observed correlations between thermal and non-thermal properties of galaxy clusters?
• Is it possible to constrain the magnetic field intensity and profile in galaxy clusters, and the energetics of turbulence in the ICM, from the comparison between model expectations and observations?
Several astrophysical ingredients are necessary to model the evolution and statistical properties of Radio Halos in the context of the re-acceleration model and to address the points above. For these reasons we devote some space in this thesis to reviewing the aspects of the physics of the ICM relevant to our goals. In Chapt. 1 we discuss the physics of galaxy clusters and, in particular, the cluster formation process; in Chapt. 2 we review the main observational properties of non-thermal components in the ICM; and in Chapt. 3 we focus on the physics of magnetic fields and of particle acceleration in galaxy clusters. As a relevant application, the theory of Alfvénic particle acceleration is applied in Chapt. 4, where we report the most important results from calculations carried out in the framework of the re-acceleration scenario. In this Chapter we show that a fraction of the energy of the fluid turbulence driven in the ICM by cluster mergers can be channelled into the injection of Alfvén waves at small scales, and that these waves can efficiently re-accelerate particles and trigger Radio Halos and hard X-ray emission. The main part of this PhD work, the calculation of the statistical properties of Radio Halos and non-thermal phenomena expected in the context of the re-acceleration model and their comparison with observations, is presented in Chapts. 5, 6, 7 and 8. In Chapt. 5 we present a first approach to semi-analytical calculations of the statistical properties of giant Radio Halos. The main goal of this Chapter is to model cluster formation, the injection of turbulence in the ICM and the resulting particle acceleration process. We adopt the semi-analytic extended Press & Schechter (PS) theory to follow the formation of a large synthetic population of galaxy clusters and assume that, during a merger, a fraction of the PdV work done by the infalling subclusters in passing through the most massive one is injected in the form of magnetosonic waves. The processes of stochastic acceleration of relativistic electrons by these waves, and the properties of the ensuing synchrotron (Radio Halo) and inverse Compton (IC, hard X-ray) emission of merging clusters, are then computed under the assumption of a constant rms magnetic field strength averaged over the emitting volume. The main finding of these calculations is that giant Radio Halos are naturally expected only in the more massive clusters, and that the expected fraction of clusters with Radio Halos is consistent with the observed one. In Chapt. 6 we extend the previous calculations by including a scaling of the magnetic field strength with cluster mass.
The inclusion of this scaling allows us to derive the expected correlations between the synchrotron radio power of Radio Halos and the X-ray properties (T, L_X) and mass of the hosting clusters. For the first time, we show that these correlations, calculated in the context of the re-acceleration model, are consistent with the observed ones for typical µG strengths of the average magnetic field in massive clusters. The calculations presented in this Chapter also allow us to derive the evolution of the probability of forming Radio Halos as a function of cluster mass and redshift. The most relevant finding of this Chapter is that the luminosity functions of giant Radio Halos at 1.4 GHz are expected to peak around a radio power of ≈ 10^24 W/Hz and to flatten (or cut off) at lower radio powers, because of the decrease of the electron re-acceleration efficiency in smaller galaxy clusters. In Chapt. 6 we also derive the expected number counts of Radio Halos and compare them with available observations: we claim that ≈ 100 Radio Halos in the Universe can be observed at 1.4 GHz with deep surveys, while more than 1000 Radio Halos are expected to be discovered in the near future by LOFAR at 150 MHz. This is the first (and so far unique) model expectation for the number counts of Radio Halos at lower frequencies, and it makes it possible to design future radio surveys. Based on the results of Chapt. 6, in Chapt. 7 we present work in progress on a “revision” of the occurrence of Radio Halos. We combine past results from the NVSS radio survey (z ≈ 0.05−0.2) with our ongoing GMRT Radio Halo Pointed Observations of 50 X-ray luminous galaxy clusters (at z ≈ 0.2−0.4) and discuss the possibility of testing our model expectations with the number counts of Radio Halos at z ≈ 0.05−0.4. The most relevant limitation of the calculations presented in Chapts. 5 and 6 is the assumption of an “average” size of Radio Halos, independent of their radio luminosity and of the mass of the parent clusters. This assumption cannot be relaxed within the PS formalism used to describe the cluster formation process, while a more detailed analysis of the physics of cluster mergers and of the injection of turbulence in the ICM would require an approach based on numerical (possibly MHD) simulations of a very large volume of the Universe, which is well beyond the aim of this thesis. On the other hand, in Chapt. 8 we report our discovery of novel correlations between the size (R_H) of Radio Halos and their radio power, and between R_H and the cluster mass within the Radio Halo region, M_H. In particular, this last “geometrical” M_H−R_H correlation allows us to “observationally” overcome the limitation of the “average” size of Radio Halos. Thus in this Chapter, by making use of this “geometrical” correlation and of a simplified form of the re-acceleration model based on the results of Chapts. 5 and 6, we are able to discuss the expected correlations between the synchrotron power and the thermal cluster quantities relative to the radio-emitting region. This is a new and powerful tool of investigation, and we show that all the observed correlations (P_R−R_H, P_R−M_H, P_R−T, P_R−L_X, ...) become well understood in the context of the re-acceleration model.
In addition, we find that observationally the size of Radio Halos scales non-linearly with the virial radius of the parent cluster; this immediately means that the fraction of the cluster volume which is radio-emitting increases with cluster mass, and thus that the non-thermal component in clusters is not self-similar.
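A back-of-the-envelope sketch of the cut-off argument summarized above, assuming (as a simplification) a systematic acceleration rate γ/τ_acc balanced against synchrotron and inverse Compton losses; the values of B, z and τ_acc below are illustrative, not parameters from the thesis:

```python
import numpy as np

sigma_T = 6.652e-25      # Thomson cross section [cm^2]
c       = 2.998e10       # speed of light [cm/s]
me_c2   = 8.187e-7       # electron rest energy [erg]
a_rad   = 7.566e-15      # radiation constant [erg cm^-3 K^-4]

def nu_break_GHz(B_muG=1.0, z=0.0, tau_acc_Myr=100.0):
    """Maximum synchrotron frequency from the acceleration/loss balance."""
    U_B   = (B_muG * 1e-6) ** 2 / (8 * np.pi)            # magnetic energy density
    U_cmb = a_rad * (2.725 * (1 + z)) ** 4               # CMB photon energy density
    beta  = (4.0 / 3.0) * sigma_T * c * (U_B + U_cmb) / me_c2   # loss coefficient [1/s]
    gamma_max = 1.0 / (beta * tau_acc_Myr * 3.156e13)    # gamma/tau_acc = beta*gamma^2
    nu_c = 4.2 * gamma_max**2 * B_muG                    # [Hz], pitch angle ~ 90 deg
    return nu_c / 1e9

print(nu_break_GHz(B_muG=1.0, z=0.0, tau_acc_Myr=100.0))  # ~2 GHz for these values
```

The point the sketch illustrates is the one made in the abstract: a modest change in the acceleration timescale shifts the break frequency strongly, so high-frequency surveys select only the most efficient merger-driven events.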

Relevance: 100.00%

Abstract:

This thesis is focused on the metabolomic study of human cancer tissues by ex vivo High Resolution-Magic Angle Spinning (HR-MAS) nuclear magnetic resonance (NMR) spectroscopy. This technique allows spectra to be acquired directly on intact tissues (from biopsy or surgery), and it has become very important for integrated metabonomics studies. The objective is to identify metabolites that can be used as markers for the discrimination of different types of cancer, for grading, and for the assessment of tumour evolution. Furthermore, an attempt was made to recognize metabolites that, although present at low concentration in the metabolism of tumoral tissues, can be important modulators of neoplastic proliferation. In addition, the NMR data were integrated with statistical techniques in order to obtain semi-quantitative information about the metabolite markers. In the case of gliomas, the NMR study was correlated with the gene expression of the neoplastic tissues. Chapter 1 begins with a general description of a new “omics” field, metabolomics. The study of metabolism can contribute significantly to biomedical research and, ultimately, to clinical medical practice. This rapidly developing discipline involves the study of the metabolome: the total repertoire of small molecules present in cells, tissues, organs, and biological fluids. Metabolomic approaches are becoming increasingly popular in disease diagnosis and will play an important role in improving our understanding of cancer mechanisms. Chapter 2 addresses in more detail the basis of NMR spectroscopy, presenting the HR-MAS NMR tool, which is gaining importance in the examination of tumour tissues and in the assessment of tumour grade. Advanced chemometric methods were used in an attempt to enhance the interpretation and the quantitative information content of the HR-MAS NMR data, and are presented in Chapter 3. Chemometric methods have a high potential in the study of human diseases, as they permit the extraction of new and relevant information from spectroscopic data, allowing a better interpretation of the results. Chapter 4 reports results obtained from HR-MAS NMR analyses performed on different brain tumours: medulloblastoma, meningiomas and gliomas. The medulloblastoma study is a case report of a primitive neuroectodermal tumor (PNET) localised in the cerebellar region by Magnetic Resonance Imaging (MRI) in a 3-year-old child. In vivo single-voxel 1H MRS shows high specificity in detecting the main metabolic alterations in the primitive cerebellar lesion, which consist of very high amounts of choline-containing compounds and very low levels of creatine derivatives and N-acetylaspartate. Ex vivo HR-MAS NMR, performed at 9.4 Tesla on the neoplastic specimen collected during surgery, allows the unambiguous identification of several metabolites, giving a more in-depth evaluation of the metabolic pattern of the lesion. The ex vivo HR-MAS NMR spectra show higher detail than that obtained in vivo. In addition, the spectroscopic data appear to correlate with some morphological features of the medulloblastoma. The present study shows that ex vivo HR-MAS 1H NMR can substantially extend the clinical possibilities of in vivo MRS and can be used in conjunction with in vivo spectroscopy for clinical purposes. Three histological subtypes of meningiomas (meningothelial, fibrous and oncocytic) were analysed by both in vivo and ex vivo MRS experiments.
The ex vivo HR-MAS investigations are very helpful for the assignment of the in vivo resonances of human meningiomas and for the validation of the quantification procedure of in vivo MR spectra. By using one- and two-dimensional experiments, several metabolites were identified in the different histological subtypes of meningiomas. The spectroscopic data confirmed the presence of the typical metabolites of these benign neoplasms and, at the same time, showed that meningiomas with different morphological characteristics have different metabolic profiles, particularly regarding macromolecules and lipids. The profile of total choline metabolites (tCho) and the expression of the Kennedy pathway genes in biopsies of human gliomas were also investigated using HR-MAS NMR and microfluidic genomic cards. 1H HR-MAS spectra allowed the resolution, and the relative quantification by LCModel, of the resonances from choline (Cho), phosphorylcholine (PC) and glycerophosphorylcholine (GPC), the three main components of the combined tCho peak observed in gliomas by in vivo 1H MRS. All glioma biopsies showed an increase in tCho, as calculated from the sum of the Cho, PC and GPC HR-MAS resonances. However, the increase consistently derived from augmented GPC in the low-grade gliomas and from increased PC content in the high-grade gliomas, respectively. This circumstance allowed the unambiguous discrimination of high- and low-grade gliomas by 1H HR-MAS, which could not be achieved by calculating the tCho/Cr ratio commonly used in in vivo 1H MR spectroscopy. The expression of the genes involved in choline metabolism was investigated in the same biopsies. The present findings offer a convenient procedure to classify glioma grade accurately using 1H HR-MAS, providing in addition the genetic background for the alterations of choline metabolism observed in high- and low-grade gliomas. Chapter 5 reports the study of human gastrointestinal tract (stomach and colon) neoplasms. The healthy human gastric mucosa, and the characteristics of the biochemical profile of human gastric adenocarcinoma in comparison with that of healthy gastric mucosa, were analyzed using ex vivo HR-MAS NMR. Healthy human mucosa is mainly characterized by the presence of small metabolites (more than 50 identified) and macromolecules. The adenocarcinoma spectra were dominated by signals due to triglycerides, which are usually very low in healthy gastric mucosa. The use of spin-echo experiments enabled us to detect some metabolites in the diseased tissues and to determine their variation with respect to the healthy ones. The ex vivo HR-MAS NMR analysis was then applied to human gastric tissue to obtain information on the molecular steps involved in gastric carcinogenesis. A microscopic investigation was also carried out in order to identify and locate the lipids in the cellular and extra-cellular environments. The morphological changes detected by transmission (TEM) and scanning (SEM) electron microscopy were correlated with the metabolic profile of gastric mucosa in healthy subjects and in subjects with autoimmune gastric atrophy (AAG), Helicobacter pylori-related gastritis and adenocarcinoma. These ultrastructural studies of AAG and gastric adenocarcinoma revealed intra- and extra-cellular lipid accumulation associated with severe prenecrotic hypoxia and mitochondrial degeneration.
A deep insight into the metabolic profile of healthy and neoplastic human colon tissues was gained using ex vivo HR-MAS NMR spectroscopy in combination with multivariate methods: Principal Component Analysis (PCA) and Partial Least Squares Discriminant Analysis (PLS-DA). The NMR spectra of healthy tissues highlight different metabolic profiles with respect to those of neoplastic and microscopically normal colon specimens (the latter obtained at least 15 cm away from the adenocarcinoma). Furthermore, metabolic variations are detected not only for neoplastic tissues with different histological diagnoses, but also for those classified as identical by histological analysis. These findings suggest that the same subclass of colon carcinoma is characterized, to a certain degree, by metabolic heterogeneity. The statistical multivariate approach applied to the NMR data is crucial in order to find metabolic markers of the neoplastic state of colon tissues and to classify the samples correctly. Significantly different levels of choline-containing compounds, taurine and myo-inositol were observed. Chapter 6 deals with the metabolic profile of normal and tumoral human renal tissues obtained by ex vivo HR-MAS NMR. The spectra of normal human cortex and medulla show the presence of differently distributed osmolytes as markers of the physiological renal condition. The marked decrease or disappearance of these metabolites and a high lipid content (triglycerides and cholesteryl esters) are typical of clear cell renal carcinoma (RCC), while papillary RCC is characterized by the absence of lipids and very high amounts of taurine. This research is a contribution to the biochemical classification of renal neoplastic pathologies, especially RCCs, which can be evaluated by in vivo MRS for clinical purposes. Moreover, these data help to gain a better knowledge of the molecular processes involved in the onset of renal carcinogenesis.
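A minimal sketch of the multivariate workflow mentioned in this abstract (PCA exploration followed by PLS-DA classification), using synthetic stand-ins for binned HR-MAS NMR spectra; the data, class labels and number of latent variables are hypothetical:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
n_bins = 200
healthy = rng.normal(0.0, 1.0, (20, n_bins))
tumour  = rng.normal(0.0, 1.0, (20, n_bins))
tumour[:, 50] += 3.0                      # e.g. an elevated choline-region bin
X = np.vstack([healthy, tumour])
y = np.array([0] * 20 + [1] * 20)

# Unsupervised exploration: scores on the first two principal components.
scores = PCA(n_components=2).fit_transform(X)

# PLS-DA: partial least squares regression against the dummy-coded class membership.
pls = PLSRegression(n_components=2).fit(X, y)
pred = (pls.predict(X).ravel() > 0.5).astype(int)
print("training accuracy:", (pred == y).mean())
```

In a real study the model would of course be validated on held-out biopsies; the sketch only shows how the spectra, the class labels and the two multivariate steps fit together.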

Relevance: 100.00%

Abstract:

The aim of this project was to achieve a deep understanding of the mechanisms by which Baltic amber degrades, in order to develop techniques for the preventive conservation of archaeological amber objects belonging to the National Museum of Denmark’s collections. To examine the deterioration of Baltic amber, the starting point was to identify and monitor the surface and bulk properties that are affected during degradation. The approach consisted of using accelerated ageing to initiate degradation of raw Baltic amber samples under different conditions of relative humidity, oxygen exposure and pH and, subsequently, of using non-destructive and micro-destructive techniques to identify and quantify changes in visual, chemical and structural properties. A large piece of raw Baltic amber was used to prepare several test samples for two different kinds of accelerated ageing: thermal ageing and photo-ageing. During the ageing, the amber samples were regularly examined with several analytical techniques, each providing different information: appearance and colour change by visual examination, photography and colorimetry; chemical change by infrared spectroscopy, Raman spectroscopy and elemental analysis; rate of oxidation by oxygen measurement; qualitative analysis of released volatiles by gas chromatography–mass spectrometry. The results obtained were analysed through both critical evaluation and statistical study. After interpretation of the acquired data, the main relations between amber and environmental factors during the degradation process became clearer, and it was possible to identify the major pathways by which amber degrades, such as hydrolysis of esters into alcohols and carboxylic acids, thermal oxidation and photo-oxidation of terpenoid components, and depolymerisation and decomposition of the chemical structure. Finally, it was possible to suggest a preventive conservation strategy based on the control of the climatic, atmospheric and lighting parameters of the environment where Baltic amber objects are stored and displayed.

Relevance: 100.00%

Abstract:

Tumors involving bone and soft tissues present extremely challenging situations. With the recent advances in multi-modal treatment, not only has the type of surgery moved from amputation to limb-sparing procedures, but survivorship has also improved considerably, and reconstructive techniques now aim to allow a considerably higher quality of life. In bone reconstruction, tissue engineering strategies are the main area of research. Re-vascularization and re-vitalisation of a massive allograft would considerably improve the outcome of biological reconstructions. Using a rabbit model, in this study we showed that, by implanting a vascular pedicle inside a weight-bearing massive cortical allograft, bone regeneration inside the allograft was higher than in non-vascularized implants, provided the vascular pedicle remained patent. Refinement of the animal model and the addition of stem cells and growth factors should allow a further improvement in the results. In soft tissue tumors, free and pedicled flaps have proven to be of great help as reconstruction strategies. In this study we analyzed the functional and overall outcome of 14 patients who received a re-innervated vascularized flap. We demonstrated that the use of the innovative technique of motor re-innervated muscular flaps is effective when the resection involves important functional compartments of the upper or lower limb, with no increase in post-operative complications. Although there was no direct comparison between this type of reconstruction and standard non-innervated reconstruction, we underlined the remarkably high overall functional scores and patient satisfaction following this procedure.

Relevance: 100.00%

Abstract:

The consumer demand for natural, minimally processed, fresh-like and functional food has led to an increasing interest in emerging technologies. The aim of this PhD project was to study three innovative food processing technologies currently used in the food sector. Ultrasound-assisted freezing, vacuum impregnation and pulsed electric fields were investigated through laboratory-scale systems and semi-industrial pilot plants. Furthermore, analytical and sensory techniques were developed to evaluate the quality of food and vegetable matrices obtained by traditional and emerging processes. Ultrasound was found to be a valuable technique for improving the freezing process of potatoes, anticipating the beginning of the nucleation process, mainly when applied during the supercooling phase. A study of the effects of pulsed electric fields on the phenolic and enzymatic profiles of melon juice was carried out, and the statistical treatment of the data was performed with a response surface method. Next, flavour enrichment of apple sticks was achieved by applying different techniques, such as atmospheric, vacuum and ultrasound technologies and their combinations. The second section of the thesis deals with the development of analytical methods for the discrimination and quantification of phenolic compounds in vegetable matrices, such as chestnut bark extracts and olive mill waste water. The management of waste disposal in the olive mill sector was approached with the aim of reducing the amount of waste and, at the same time, recovering valuable by-products to be used in different industrial sectors. Finally, the sensory analysis of boiled potatoes was carried out through the development of a quantitative descriptive procedure for the study of Italian and Mexican potato varieties. An update on flavour development in fresh and cooked potatoes was compiled, and a sensory glossary, including general and specific definitions related to organic products, used in the European project Ecropolis, was drafted.
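A minimal sketch of what a response-surface treatment of PEF data can look like, assuming two hypothetical process factors (field strength and treatment time) and a measured response such as total phenol content; the design points and values below are invented for illustration and are not the thesis data:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

# Hypothetical 3x3 factorial design: field strength [kV/cm] and treatment time [us].
field_kV_cm = np.array([1, 1, 1, 2, 2, 2, 3, 3, 3], dtype=float)
time_us     = np.array([10, 20, 30, 10, 20, 30, 10, 20, 30], dtype=float)
phenols     = np.array([5.1, 5.6, 5.4, 6.0, 6.8, 6.5, 5.9, 6.4, 6.1])  # invented response

X = np.column_stack([field_kV_cm, time_us])
quad = PolynomialFeatures(degree=2, include_bias=False)   # E, t, E^2, E*t, t^2
model = LinearRegression().fit(quad.fit_transform(X), phenols)

# The fitted second-order coefficients define the response surface; its stationary
# point (if any) indicates the factor combination maximising the response.
print(dict(zip(quad.get_feature_names_out(["E", "t"]), model.coef_.round(3))))
```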

Relevance: 100.00%

Abstract:

The thesis aims to present the advances achieved in the practice of captive breeding of the European eel (Anguilla anguilla). The aspects investigated concern husbandry (breeder selection, response to hormonal stimulation, reproductive performance, egg incubation), physiological aspects (plasma endocrine profiles of the breeders), and engineering aspects. Studies conducted on various populations of wild eel have shown that the main determining factor in the selection of wild females destined for captive breeding must be the Silver Index, which can determine the stage of pubertal development. The hormonal induction protocol adopted, with increasing doses of carp pituitary extract, has proven useful for ovarian development, with a synchronization effect that is positively reflected in egg production. The studies on the effects of photoperiod show that the condition of total darkness can positively influence reproduction in captivity. The effects of photoperiod were also investigated at the physiological level, by observing the plasma levels of steroids (E2, T) and thyroid hormones (T3 and T4) and the hepatic expression of vitellogenin (vtg1 and vtg2) and of the estradiol membrane receptor (ESR1). The comparison between spontaneous spawning and insemination by stripping indicates that the former leads to a better qualitative and quantitative yield of eggs capable of being fertilized; moreover, the presence of a fraction of completely transparent oocytes can be used as an indicator for obtaining eggs with a good fertilization rate. Finally, the design and implementation of a recirculating aquaculture system suited to the species-specific needs of the eel showed that, in order to improve the reproductive results, it would be preferable to adopt low-flow and low-density incubation.

Relevance: 100.00%

Abstract:

Information is nowadays a key resource: machine learning and data mining techniques have been developed to extract high-level information from large amounts of data. As most data come in the form of unstructured text in natural languages, research on text mining is currently very active and deals with practical problems. Among these, text categorization deals with the automatic organization of large quantities of documents into predefined taxonomies of topic categories, possibly arranged in large hierarchies. In commonly proposed machine learning approaches, classifiers are automatically trained from pre-labeled documents: they can perform very accurate classification, but often require a sizable training set and notable computational effort. Methods for cross-domain text categorization have been proposed, making it possible to leverage a set of labeled documents from one domain to classify those of another. Most methods use advanced statistical techniques, usually involving tuning of parameters. A first contribution presented here is a method based on nearest-centroid classification, where profiles of categories are generated from the known domain and then iteratively adapted to the unknown one. Despite being conceptually simple and having easily tuned parameters, this method achieves state-of-the-art accuracy on most benchmark datasets with fast running times. A second, deeper contribution involves the design of a domain-independent model to distinguish the degree and type of relatedness between arbitrary documents and topics, inferred from the different types of semantic relationships between their representative words, identified by specific search algorithms. The application of this model is tested on both flat and hierarchical text categorization, where it potentially allows the efficient addition of new categories during classification. Results show that classification accuracy still requires improvement, but models generated from one domain are shown to be effectively reusable in a different one.
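A minimal sketch of the iterative nearest-centroid idea described above, not the thesis implementation: category profiles are built from labelled source-domain documents and then adapted to the unlabelled target domain by re-estimating them from the documents they attract. The toy documents and the two categories are placeholders:

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.preprocessing import normalize

source_docs = ["stock market rally lifts shares",
               "team wins the football match",
               "quarterly earnings boost stock prices",
               "coach praises the team after the match"]
source_labels = np.array([0, 1, 0, 1])      # 0 = finance, 1 = sport (toy categories)
target_docs = ["shares and earnings push the market higher",
               "the team celebrates a hard fought match"]

vec = TfidfVectorizer().fit(source_docs + target_docs)
Xs = normalize(vec.transform(source_docs).toarray())
Xt = normalize(vec.transform(target_docs).toarray())

# Initial category profiles (centroids) from the known source domain.
centroids = normalize(np.vstack([Xs[source_labels == k].mean(axis=0) for k in (0, 1)]))

for _ in range(5):                           # iterative adaptation to the target domain
    sims = Xt @ centroids.T                  # cosine similarity (rows are unit-normalised)
    assign = sims.argmax(axis=1)
    new = [Xt[assign == k].mean(axis=0) if np.any(assign == k) else centroids[k]
           for k in (0, 1)]
    centroids = normalize(np.vstack(new))    # profiles re-estimated from attracted docs

print("target-domain assignments:", assign)
```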

Relevance: 100.00%

Abstract:

The idea behind the project is to develop a methodology and techniques for the diagnosis and prediction of the state of charge and state of health of lithium-ion batteries for automotive applications. For lithium-ion batteries, residual functionality is measured in terms of state of health; however, this quantity cannot be measured directly, so it must be estimated. The development of the algorithms is based on the identification of the causes of battery degradation, in order to model and predict its trend. Models have therefore been developed that are able to predict the electrical, thermal and aging behavior of the battery. In addition to the models, it was necessary to develop algorithms capable of monitoring the state of the battery, both online and offline. This was achieved with algorithms based on Kalman filters, which allow the estimation of the system state in real time. Machine learning algorithms, which allow offline analysis of battery deterioration using a statistical approach, make it possible to analyze information from the entire fleet of vehicles. The two systems work in synergy in order to achieve the best performance. Validation was performed with laboratory tests on different batteries and under different conditions. The development of the model made it possible to reduce the duration of the experimental tests: some specific phenomena were tested in the laboratory, while the other cases were generated artificially.
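A minimal sketch of Kalman-filter-based state-of-charge estimation under strong simplifying assumptions: a single SOC state, a coulomb-counting process model and a linearised open-circuit-voltage measurement v = k_ocv·SOC + v0. The capacity, noise levels, current profile and OCV line are illustrative, not parameters from the thesis:

```python
import numpy as np

dt, Q_As = 1.0, 3600.0 * 2.0          # time step [s], capacity [A*s] (2 Ah, hypothetical)
k_ocv, v0 = 0.8, 3.3                  # linearised OCV model: slope [V/SOC], offset [V]
q_proc, r_meas = 1e-7, 1e-3           # process / measurement noise variances

rng = np.random.default_rng(2)
soc_true, soc_est, P = 0.9, 0.7, 0.1  # start the filter with a deliberately wrong estimate
current = 1.0                         # constant 1 A discharge (hypothetical profile)

for _ in range(600):
    soc_true -= current * dt / Q_As
    v_meas = k_ocv * soc_true + v0 + rng.normal(0, np.sqrt(r_meas))

    # Predict: propagate SOC with coulomb counting and grow the covariance.
    soc_est -= current * dt / Q_As
    P += q_proc
    # Update: correct the prediction with the voltage measurement.
    K = P * k_ocv / (k_ocv * P * k_ocv + r_meas)
    soc_est += K * (v_meas - (k_ocv * soc_est + v0))
    P *= (1 - K * k_ocv)

print(f"true SOC {soc_true:.3f}  estimated SOC {soc_est:.3f}")
```

The same predict/update structure carries over to the multi-state (e.g. extended or dual) filters used in practice for combined SOC/SOH estimation.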

Relevance: 100.00%

Abstract:

Noise is a constant presence in measurements. Its origin is related to the microscopic properties of matter. Since the seminal work of Brown in 1828, the study of stochastic processes has gained increasing interest with the development of new mathematical and analytical tools. In the last decades, the central role that noise plays in chemical and physiological processes has become recognized. The dual role of noise as nuisance/resource pushes towards the development of new decomposition techniques that divide a signal into its deterministic and stochastic components. In this thesis I show how methods based on Singular Spectrum Analysis (SSA) have the right properties to fulfil this requirement. During my work I applied SSA to different signals of interest in chemistry: I developed a novel iterative procedure for the denoising of powder X-ray diffractograms, and I “denoised” two-dimensional images from electrochemiluminescence (ECL) imaging experiments on micro-beads, obtaining new insight into the ECL mechanism. I also used Principal Component Analysis to investigate the relationship between brain electrophysiological signals and voice emission.
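A minimal sketch of basic Singular Spectrum Analysis on a one-dimensional signal, as a point of reference for the decomposition idea mentioned above: embed the series into a trajectory matrix, take its SVD, and rebuild the deterministic part from the leading components by diagonal averaging. The window length and the number of retained components are illustrative choices, not those used in the thesis:

```python
import numpy as np

def ssa_denoise(x, window=50, n_components=2):
    n = len(x)
    k = n - window + 1
    # Trajectory (Hankel) matrix: lagged copies of the signal as columns.
    traj = np.column_stack([x[i:i + window] for i in range(k)])
    u, s, vt = np.linalg.svd(traj, full_matrices=False)
    # Keep only the leading singular triplets (the slowly varying structure).
    approx = (u[:, :n_components] * s[:n_components]) @ vt[:n_components]
    # Diagonal (Hankel) averaging maps the low-rank matrix back to a time series.
    out, counts = np.zeros(n), np.zeros(n)
    for j in range(k):
        out[j:j + window] += approx[:, j]
        counts[j:j + window] += 1
    return out / counts

t = np.linspace(0, 1, 500)
noisy = np.sin(2 * np.pi * 5 * t) + 0.4 * np.random.default_rng(3).normal(size=t.size)
clean = ssa_denoise(noisy, window=50, n_components=2)
print("residual std:", np.std(clean - np.sin(2 * np.pi * 5 * t)).round(3))
```

The discarded trailing components form the "stochastic" part of the decomposition; iterative and two-dimensional variants build on this same embed–decompose–reconstruct cycle.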

Relevance: 100.00%

Abstract:

With the advent of new technologies it is increasingly easy to obtain data of different natures from ever more accurate sensors that measure the most disparate physical quantities with different methodologies. The collection of data thus becomes progressively more important and takes the form of archiving, cataloguing, and online and offline consultation of information. Over time, the amount of data collected can become so large that it contains information that cannot be easily explored manually or with basic statistical techniques. Such Big Data therefore become the object of more advanced investigation techniques, such as Machine Learning and Deep Learning. In this work, some applications to precision zootechnics and to the heat stress experienced by dairy cows are described. Experimental stables in Italy and Germany were involved in the training and testing of a Random Forest algorithm, obtaining a prediction of milk production as a function of the microclimatic conditions of the previous days with satisfactory accuracy. Furthermore, in order to identify an objective method for detecting production drops, a Robust Statistics technique was used and compared with the Wood model, which is typically adopted as an analytical model of the lactation curve. Its application to some sample lactations and the results obtained give confidence in the use of this method in the future.
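A minimal sketch of the two analytical ingredients mentioned above, on synthetic data: fitting Wood's lactation model y(t) = a·t^b·e^(−c·t) to daily yields, and flagging production drops from the residuals with a robust median/MAD rule instead of the mean and standard deviation. The parameters, the injected drop and the 3-sigma-equivalent threshold are illustrative choices:

```python
import numpy as np
from scipy.optimize import curve_fit

def wood(t, a, b, c):
    """Wood's lactation curve: y(t) = a * t**b * exp(-c * t)."""
    return a * t**b * np.exp(-c * t)

rng = np.random.default_rng(4)
days = np.arange(1, 301, dtype=float)
milk = wood(days, 20.0, 0.25, 0.004) + rng.normal(0, 0.8, days.size)
milk[150:155] -= 6.0                       # a few days of heat-stress-like drop (synthetic)

params, _ = curve_fit(wood, days, milk, p0=(15.0, 0.2, 0.003))
residuals = milk - wood(days, *params)

med = np.median(residuals)
mad = 1.4826 * np.median(np.abs(residuals - med))   # robust scale estimate
drops = days[residuals < med - 3 * mad]
print("fitted (a, b, c):", np.round(params, 3))
print("flagged drop days:", drops.astype(int))
```

The median/MAD rule keeps the drop days themselves from inflating the scale estimate, which is the practical motivation for a robust alternative to a plain standard-deviation threshold.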

Relevance: 100.00%

Abstract:

Today’s data are increasingly complex, and classical statistical techniques need ever more refined mathematical tools to model and investigate them. Paradigmatic situations are represented by data which need to be considered up to some kind of transformation, and by all those circumstances in which the analyst needs to define a general concept of shape. Topological Data Analysis (TDA) is a field which is fundamentally contributing to such challenges by extracting topological information from data with a plethora of interpretable and computationally accessible pipelines. We contribute to this field by developing a series of novel tools, techniques and applications for working with a particular topological summary called the merge tree. To analyse sets of merge trees we introduce a novel metric structure along with an algorithm to compute it, define a framework to compare different functions defined on merge trees, and investigate the metric space obtained with the aforementioned metric. Different geometric and topological properties of the space of merge trees are established, with the aim of obtaining a deeper understanding of such trees. To showcase the effectiveness of the proposed metric, we develop an application in the field of Functional Data Analysis, working with functions up to homeomorphic reparametrization, and in the field of radiomics, where each patient is represented via a clustering dendrogram.
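A minimal sketch of the kind of information a merge tree encodes, not the metric framework developed in the thesis: a union-find sweep over a sampled one-dimensional function records the value at which two sublevel-set branches, born at distinct local minima, join:

```python
import numpy as np

def merge_events(f):
    order = np.argsort(f)                  # process vertices by increasing function value
    parent, root_min = {}, {}

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    events = []
    for v in order:
        parent[v] = v
        root_min[v] = f[v]
        for nb in (v - 1, v + 1):          # 1-D domain: only left/right neighbours
            if nb in parent:
                rv, rn = find(v), find(nb)
                if rv == rn:
                    continue
                lo, hi = sorted((root_min[rv], root_min[rn]))
                if hi < f[v]:              # genuine saddle: two branches born below join here
                    events.append((f[v], lo, hi))
                parent[rn] = rv
                root_min[rv] = lo
    return events

f = np.array([3.0, 1.0, 2.5, 0.5, 2.0, 1.5, 4.0])
for height, elder, younger in merge_events(f):
    print(f"branches born at {younger} and {elder} join at height {height}")
```

Each recorded event corresponds to an internal node of the merge tree (and, via the elder rule, to a point of the 0-dimensional persistence diagram); the thesis builds its metric and statistical tools on top of summaries of this kind.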

Relevance: 100.00%

Abstract:

Spectral sensors are a wide class of devices that are extremely useful for detecting essential information about the environment and materials with a high degree of selectivity. Recently, they have achieved high degrees of integration and low implementation cost, making them suitable for fast, small and non-invasive monitoring systems. However, the useful information is hidden in the spectra and is difficult to decode, so mathematical algorithms are needed to infer the values of the variables of interest from the acquired data. Among the different families of predictive modeling, Principal Component Analysis and the techniques stemming from it can provide very good performance, as well as small computational and memory requirements. For these reasons, they allow the prediction to be implemented even in embedded and autonomous devices. In this thesis, I present four practical applications of these algorithms to the prediction of different variables: moisture of soil, moisture of concrete, freshness of anchovies/sardines, and concentration of gases. In all of these cases, the workflow is the same. Initially, an acquisition campaign was performed to acquire both the spectra and the variables of interest from the samples. These data were then used as input for the creation of the prediction models, to solve both classification and regression problems. From these models, an array of calibration coefficients is derived and used for the implementation of the prediction in an embedded system. The presented results show that this workflow was successfully applied to very different scientific fields, obtaining autonomous and non-invasive devices able to predict the value of physical parameters of choice from new spectral acquisitions.
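A minimal sketch of a principal-component-regression calibration along the lines described above, on synthetic spectra: the PCA compression and the regression are collapsed into a single coefficient vector and intercept, which is the kind of array an embedded device can apply to a new spectrum with one dot product. All data, dimensions and the "soil moisture" target are invented for illustration:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(5)
n_samples, n_wavelengths = 60, 128
moisture = rng.uniform(5, 40, n_samples)                      # hypothetical reference values
spectra = np.outer(moisture, rng.normal(0, 1, n_wavelengths)) * 0.01 \
          + rng.normal(0, 0.05, (n_samples, n_wavelengths))   # synthetic spectra

# Calibration: PCA compression followed by linear regression on the scores.
mean_spec = spectra.mean(axis=0)
pca = PCA(n_components=5).fit(spectra)
reg = LinearRegression().fit(pca.transform(spectra), moisture)

# Collapse PCA + regression into one coefficient vector and an intercept.
coefs = pca.components_.T @ reg.coef_
intercept = reg.intercept_ - mean_spec @ coefs

new_spectrum = spectra[0]
print("embedded-style prediction:", new_spectrum @ coefs + intercept)
print("pipeline prediction      :", reg.predict(pca.transform(new_spectrum[None]))[0])
```

The two printed values coincide, showing that the full model reduces to a stored coefficient array plus an offset, which keeps the on-device memory and computation footprint small.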