Abstract:
Interaction protocols establish how different computational entities can interact with each other. The interaction can be aimed at the exchange of data, as in 'communication protocols', or oriented towards achieving some result, as in 'application protocols'. Moreover, with the increasing complexity of modern distributed systems, protocols are also used to control that complexity and to ensure that the system as a whole evolves with certain features. However, the extensive use of protocols has raised several issues, from the language for specifying them to the various verification aspects. Computational Logic provides models, languages and tools that can be effectively adopted to address such issues: its declarative nature can be exploited for a protocol specification language, while its operational counterpart can be used to reason upon such specifications. In this thesis we propose a proof-theoretic framework, called SCIFF, together with its extensions. SCIFF is based on Abductive Logic Programming and provides a formal specification language with a clear declarative semantics (based on abduction). The operational counterpart is given by a proof procedure that allows one to reason upon the specifications and to test the conformance of given interactions w.r.t. a defined protocol. Moreover, by suitably adapting the SCIFF framework, we propose solutions for addressing (1) the verification of protocol properties (the g-SCIFF framework), and (2) the a priori conformance verification of peers w.r.t. a given protocol (the AlLoWS framework). We also introduce an agent-based architecture, the SCIFF Agent Platform, where the same protocol specification can be used to program the interacting peers and to ease their implementation.
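As a toy illustration of the expectation-based conformance idea sketched above (not the actual SCIFF proof procedure or syntax, which works abductively over integrity constraints with variables, deadlines and negative expectations), a protocol can be modelled as rules that raise expectations when a triggering event occurs, and a trace called conformant when every expectation is later fulfilled:

```python
# Toy expectation-based conformance check, loosely inspired by the
# idea above; rule format and names are invented, not SCIFF syntax.

def conformance(trace, rules):
    """rules are (trigger, expected) pairs: each occurrence of
    trigger raises an expectation that expected happens later."""
    expectations = [(i, expected)
                    for i, event in enumerate(trace)
                    for trigger, expected in rules
                    if event == trigger]
    # an expectation raised at position i is fulfilled if the
    # expected event occurs anywhere after i
    unfulfilled = [e for i, e in expectations if e not in trace[i + 1:]]
    return len(unfulfilled) == 0, unfulfilled

# A 'request' must eventually be answered:
ok, _ = conformance(["request", "answer"], [("request", "answer")])
bad, missing = conformance(["request"], [("request", "answer")])
```

Here the second trace is not conformant because the expectation of an `answer` remains unfulfilled; the full framework additionally reasons over open (still-running) interactions, which this sketch omits.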
Abstract:
This thesis focuses on the metabolomic study of human cancer tissues by ex vivo High Resolution-Magic Angle Spinning (HR-MAS) nuclear magnetic resonance (NMR) spectroscopy. This new technique allows the acquisition of spectra directly on intact tissues (from biopsy or surgery), and it has become very important for integrated metabonomics studies. The objective is to identify metabolites that can be used as markers for discriminating the different types of cancer, for grading, and for assessing the evolution of the tumour. Furthermore, an attempt was made to recognize metabolites that, although present at low concentration in the metabolism of tumoral tissues, can be important modulators of neoplastic proliferation. In addition, NMR data were integrated with statistical techniques in order to obtain semi-quantitative information about the metabolite markers. In the case of gliomas, the NMR study was correlated with the gene expression of neoplastic tissues. Chapter 1 begins with a general description of a new “omics” field, metabolomics. The study of metabolism can contribute significantly to biomedical research and, ultimately, to clinical medical practice. This rapidly developing discipline involves the study of the metabolome: the total repertoire of small molecules present in cells, tissues, organs, and biological fluids. Metabolomic approaches are becoming increasingly popular in disease diagnosis and will play an important role in improving our understanding of cancer mechanisms. Chapter 2 addresses in more detail the basis of NMR spectroscopy, presenting the new HR-MAS NMR tool, which is gaining importance in the examination of tumour tissues and in the assessment of tumour grade. Some advanced chemometric methods, used in an attempt to enhance the interpretation and quantitative information content of the HR-MAS NMR data, are presented in Chapter 3.
Chemometric methods seem to have high potential in the study of human diseases, as they permit the extraction of new and relevant information from spectroscopic data, allowing a better interpretation of the results. Chapter 4 reports results obtained from HR-MAS NMR analyses performed on different brain tumours: medulloblastoma, meningiomas and gliomas. The medulloblastoma study is a case report of a primitive neuroectodermal tumor (PNET) localised in the cerebellar region by Magnetic Resonance Imaging (MRI) in a 3-year-old child. In vivo single-voxel 1H MRS shows high specificity in detecting the main metabolic alterations in the primitive cerebellar lesion, which consist of very high amounts of choline-containing compounds and very low levels of creatine derivatives and N-acetylaspartate. Ex vivo HR-MAS NMR, performed at 9.4 Tesla on the neoplastic specimen collected during surgery, allows the unambiguous identification of several metabolites, giving a more in-depth evaluation of the metabolic pattern of the lesion. The ex vivo HR-MAS NMR spectra show higher detail than those obtained in vivo. In addition, the spectroscopic data appear to correlate with some morphological features of the medulloblastoma. The present study shows that ex vivo HR-MAS 1H NMR can strongly extend the clinical possibilities of in vivo MRS and can be used in conjunction with in vivo spectroscopy for clinical purposes. Three histological subtypes of meningiomas (meningothelial, fibrous and oncocytic) were analysed by both in vivo and ex vivo MRS experiments. The ex vivo HR-MAS investigations are very helpful for the assignment of the in vivo resonances of human meningiomas and for the validation of the quantification procedure for in vivo MR spectra. By using one- and two-dimensional experiments, several metabolites in different histological subtypes of meningiomas were identified.
The spectroscopic data confirmed the presence of the typical metabolites of these benign neoplasms and, at the same time, that meningiomas with different morphological characteristics have different metabolic profiles, particularly regarding macromolecules and lipids. The profile of total choline metabolites (tCho) and the expression of the Kennedy pathway genes in biopsies of human gliomas were also investigated using HR-MAS NMR and microfluidic genomic cards. 1H HR-MAS spectra allowed the resolution and relative quantification by LCModel of the resonances from choline (Cho), phosphorylcholine (PC) and glycerophosphorylcholine (GPC), the three main components of the combined tCho peak observed in gliomas by in vivo 1H MRS spectroscopy. All glioma biopsies showed an increase in tCho, as calculated from the sum of the Cho, PC and GPC HR-MAS resonances. However, the increase derived consistently from augmented GPC in low grade gliomas and from increased PC content in high grade gliomas, respectively. This allowed the unambiguous discrimination of high and low grade gliomas by 1H HR-MAS, which could not be achieved by calculating the tCho/Cr ratio commonly used in in vivo 1H MR spectroscopy. The expression of the genes involved in choline metabolism was investigated in the same biopsies. The present findings offer a convenient procedure to classify glioma grade accurately using 1H HR-MAS, providing in addition the genetic background for the alterations of choline metabolism observed in high and low grade gliomas. Chapter 5 reports the study on human gastrointestinal tract (stomach and colon) neoplasms. The healthy human gastric mucosa, and the characteristics of the biochemical profile of human gastric adenocarcinoma in comparison with that of healthy gastric mucosa, were analyzed using ex vivo HR-MAS NMR. Healthy human mucosa is mainly characterized by the presence of small metabolites (more than 50 identified) and macromolecules.
The adenocarcinoma spectra were dominated by signals due to triglycerides, which are usually very low in healthy gastric mucosa. The use of spin-echo experiments enabled us to detect some metabolites in the unhealthy tissues and to determine their variation with respect to the healthy ones. The ex vivo HR-MAS NMR analysis was then applied to human gastric tissue to obtain information on the molecular steps involved in gastric carcinogenesis. A microscopic investigation was also carried out in order to identify and locate the lipids in the cellular and extracellular environments. Correlations of the morphological changes detected by transmission (TEM) and scanning (SEM) electron microscopy with the metabolic profile of gastric mucosa in healthy, autoimmune atrophic gastritis (AAG), Helicobacter pylori-related gastritis and adenocarcinoma subjects were obtained. These ultrastructural studies of AAG and gastric adenocarcinoma revealed intra- and extracellular lipid accumulation associated with severe prenecrotic hypoxia and mitochondrial degeneration. A deep insight into the metabolic profile of healthy and neoplastic human colon tissues was gained using ex vivo HR-MAS NMR spectroscopy in combination with multivariate methods: Principal Component Analysis (PCA) and Partial Least Squares Discriminant Analysis (PLS-DA). The NMR spectra of healthy tissues highlight different metabolic profiles with respect to those of neoplastic and microscopically normal colon specimens (the latter obtained at least 15 cm away from the adenocarcinoma). Furthermore, metabolic variations are detected not only for neoplastic tissues with different histological diagnoses, but also for those classified as identical by histological analysis. These findings suggest that the same subclass of colon carcinoma is characterized, to a certain degree, by metabolic heterogeneity.
The statistical multivariate approach applied to the NMR data is crucial for finding metabolic markers of the neoplastic state of colon tissues and for correctly classifying the samples. Significantly different levels of choline-containing compounds, taurine and myo-inositol were observed. Chapter 6 deals with the metabolic profile of normal and tumoral human renal tissues obtained by ex vivo HR-MAS NMR. The spectra of human normal cortex and medulla show the presence of differently distributed osmolytes as markers of physiological renal condition. The marked decrease or disappearance of these metabolites and the high lipid content (triglycerides and cholesteryl esters) are typical of clear cell renal carcinoma (RCC), while papillary RCC is characterized by the absence of lipids and very high amounts of taurine. This research contributes to the biochemical classification of renal neoplastic pathologies, especially RCCs, which can be evaluated by in vivo MRS for clinical purposes. Moreover, these data help to gain a better knowledge of the molecular processes involved in the onset of renal carcinogenesis.
Abstract:
In the development of information systems, numerous technologies have become established, and they must be used in a combined and, if possible, synergistic way. On the one hand, relational database management systems allow efficient and effective management of persistent, shared, transactional data. On the other, object-oriented tools and methods (programming languages, but also analysis and design methodologies) enable effective development of the application logic of applications. It is useful in this context to explain what is meant by information system and by computer system. Information system: the set of people, technological resources and business procedures whose task is to produce and preserve the information needed to operate in the enterprise and to manage it. Computer system: the set of computing tools used for the automatic processing of information, in order to support the functions of the information system. In other words, the computer system collects, processes, stores and exchanges information using Information and Communication Technologies (ICT): computers, peripherals, communication media, programs. The computer system is therefore a component of the information system. The information obtained from processing data must be saved somewhere, so that it lasts over time after the processing; computing comes to the rescue for this purpose. Data are raw informational material, not (yet) processed by the receiver, and can be discovered, searched for, collected and produced. They are the raw material that we have available, or that we produce, to build our communication processes. A company's data is its treasure and represents its evolutionary history.
At the beginning of this introduction it was mentioned that several technologies have become established in the development of computer systems and that, in particular, the use of relational database management systems provides effective and efficient management of persistent data. In computing, data persistence means the ability of data to survive the execution of the program that created it; otherwise, data would be held only in RAM and would be lost when the computer is switched off. In programming, persistence means the possibility of making data structures outlive the execution of a single program. This requires saving them on a non-volatile storage device, for example in a file system or a database. In this thesis a system was developed that can manage a hierarchical or relational database, allowing the import of data described by a DTD grammar. Chapter 1 examines in more detail what is meant by information system, the client-server model, and data security. Chapter 2 discusses the Java programming language, databases, and XML files. Chapter 3 describes the UML analysis and modeling language, with explicit reference to the developed project. Chapter 4 describes the project that was implemented and the technologies and tools used.
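As a minimal sketch of the kind of import described above (the element, attribute and table names below are invented for illustration; the real system reads data described by a DTD grammar), the following loads XML records into a relational table using only Python's standard library:

```python
import sqlite3
import xml.etree.ElementTree as ET

# Hypothetical records whose shape a DTD would describe.
XML = """<library>
  <book id="1"><title>Dati e Basi</title><year>2004</year></book>
  <book id="2"><title>XML in pratica</title><year>2007</year></book>
</library>"""

def import_books(xml_text, conn):
    # One relational table per repeating element of the grammar.
    conn.execute("CREATE TABLE IF NOT EXISTS book "
                 "(id INTEGER PRIMARY KEY, title TEXT, year INTEGER)")
    for b in ET.fromstring(xml_text).iter("book"):
        conn.execute("INSERT INTO book VALUES (?, ?, ?)",
                     (int(b.get("id")), b.findtext("title"),
                      int(b.findtext("year"))))
    conn.commit()

conn = sqlite3.connect(":memory:")   # persistent file path in practice
import_books(XML, conn)
rows = conn.execute("SELECT title FROM book ORDER BY id").fetchall()
```

With a file-backed database instead of `:memory:`, the imported data would persist beyond the program run, which is exactly the persistence property discussed above.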
Abstract:
The advent of distributed and heterogeneous systems has laid the foundation for the birth of new architectural paradigms, in which many separate and autonomous entities collaborate and interact with the aim of achieving complex strategic goals impossible to accomplish on their own. A non-exhaustive list of systems targeted by such paradigms includes Business Process Management, Clinical Guidelines and Careflow Protocols, Service-Oriented and Multi-Agent Systems. It is largely recognized that engineering these systems requires novel modeling techniques. In particular, many authors claim that an open, declarative perspective is needed to complement the closed, procedural nature of state-of-the-art specification languages. For example, the ConDec language has recently been proposed to target the declarative and open specification of Business Processes, overcoming the over-specification and over-constraining issues of classical procedural approaches. On the one hand, the success of such novel modeling languages strongly depends on their usability by non-IT-savvy users: they must provide an appealing, intuitive graphical front-end. On the other hand, they must be amenable to verification, in order to guarantee the trustworthiness and reliability of the developed model, as well as to ensure that the actual executions of the system effectively comply with it. In this dissertation, we claim that Computational Logic is a suitable framework for dealing with the specification, verification, execution, monitoring and analysis of these systems. We propose to adopt an extended version of the ConDec language for specifying interaction models with a declarative, open flavor.
We show how all the (extended) ConDec constructs can be automatically translated to the CLIMB Computational Logic-based language, and illustrate how its corresponding reasoning techniques can be successfully exploited to provide support and verification capabilities along the whole life cycle of the targeted systems.
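To give a flavor of the kind of checking such declarative constraints enable (a hypothetical sketch, not the CLIMB translation itself), here is a check of a ConDec-style "response" constraint, i.e. every occurrence of activity a must eventually be followed by an occurrence of activity b, over a completed execution trace:

```python
# Hypothetical check of a ConDec-style response(a, b) constraint on a
# finished trace: a later b fulfils all pending occurrences of a.

def response_holds(trace, a, b):
    expecting = False
    for activity in trace:
        if activity == a:
            expecting = True       # a new obligation to see b
        elif activity == b:
            expecting = False      # all pending obligations fulfilled
    return not expecting

# "every order is eventually paid"
compliant = response_holds(["order", "pay", "order", "pay"], "order", "pay")
violated = response_holds(["order", "pay", "order"], "order", "pay")
```

A declarative model is simply a conjunction of such constraints over the activities, with no prescribed control flow in between; the logic-based reasoning techniques mentioned above generalize this check to running traces and to static verification.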
Abstract:
Investigations on the formation and specification of neural precursor cells in the central nervous system of the Drosophila melanogaster embryo. Specification of a unique cell fate during development of a multicellular organism is often a function of a cell's position. The Drosophila central nervous system (CNS) provides an ideal system to dissect the signalling events during development that lead to cell-specific patterns. The different cell types in the CNS are formed from relatively few precursor cells, the neuroblasts (NBs), which delaminate from the neurogenic region of the ectoderm. Delamination occurs in five waves, S1-S5, finally leading to a subepidermal layer consisting of about 30 NBs, each with a unique identity, arranged in a stereotyped spatial pattern in each hemisegment. This identity depends on several factors, such as the concentrations of various morphogens, cell-cell interactions and long-range signals present at the position and time of the NB's birth. The early NBs, delaminating during S1 and S2, form an orthogonal array of four rows (2/3, 4, 5, 6/7) and three columns (medial, intermediate, and lateral). However, this three-column, four-row arrangement is only transitory during the early stages of neurogenesis and is obscured by the late-emerging (S3-S5) neuroblasts (Doe and Goodman, 1985; Goodman and Doe, 1993). Therefore, the aim of my study has been to identify novel genes which play a role in the formation or specification of late delaminating NBs. In this study the gene anterior open, or yan, was picked up in a genetic screen to identify novel, as yet unidentified genes involved in late neuroblast formation and specification. I have shown that yan is responsible for maintaining the cells of the neuroectoderm in an undifferentiated state by interfering with the Notch signalling mechanism.
Secondly, I have studied the function and interactions of segment polarity genes within a certain neuroectodermal region, namely the engrailed (en) expressing domain, with regard to the fate specification of a set of late neuroblasts, namely NB 6-4 and NB 7-3. I have dissected the regulatory interaction of the segment polarity genes wingless (wg), hedgehog (hh) and engrailed (en), as they maintain each other's expression, to show that En is a prerequisite for neurogenesis, and I have shown that the interplay of the segmentation genes naked (nkd) and gooseberry (gsb), both of which are targets of wingless (wg) activity, leads to differential commitment of the NB 7-3 and NB 6-4 cell fates. I have shown that in the absence of either nkd or gsb one NB fate is replaced by the other. However, the temporal sequence of delamination is maintained, suggesting that the formation and specification of these two NBs are under independent control.
Abstract:
In this work I present the processing carried out on the XML documents that were provided to us. More specifically, the work focuses on the bibliographic references contained in each document, with the goal of processing the extracted information so that it can be exported in the RDF (Resource Description Framework) format. The XML (eXtensible Markup Language) documents provided come from Elsevier, one of the largest publishers of scientific articles organized in specialized journals.
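A minimal sketch of the kind of transformation described (with invented element names, not Elsevier's actual schema, and Dublin Core predicates chosen only for illustration) might extract each reference and serialize it as RDF triples in N-Triples form:

```python
import xml.etree.ElementTree as ET

# Invented article structure; real Elsevier XML uses its own DTDs.
XML = """<article>
  <bibliography>
    <ref id="r1"><author>Rossi</author><year>2010</year></ref>
    <ref id="r2"><author>Bianchi</author><year>2012</year></ref>
  </bibliography>
</article>"""

DCT = "http://purl.org/dc/terms/"

def refs_to_ntriples(xml_text, base="http://example.org/ref/"):
    triples = []
    for ref in ET.fromstring(xml_text).iter("ref"):
        subj = "<%s%s>" % (base, ref.get("id"))
        triples.append('%s <%screator> "%s" .'
                       % (subj, DCT, ref.findtext("author")))
        triples.append('%s <%sdate> "%s" .'
                       % (subj, DCT, ref.findtext("year")))
    return triples

triples = refs_to_ntriples(XML)
```

Each reference becomes a subject URI with one triple per extracted field, which is the essence of exporting bibliographic metadata to RDF.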
Abstract:
The success of XML has renewed interest in change control for trees and semi-structured data. The main needs are managing document revisions, querying and monitoring changes, and efficiently exchanging documents and their updates. The changes that occur between two versions of a document are unknown to the system, so a diffing algorithm is used to build a delta that represents them. Various diffing algorithms have been proposed: some take the tree structure of XML documents into account, while others do not, and some can find a more "concise" sequence of changes, which improves the quality of change monitoring and querying. Other approaches, different from diffing algorithms, have been developed for monitoring changes to XML documents; they obtain nearly identical results while offering change querying that is easier for human users. There are, in fact, editing programs with change-tracking tools, which allow several authors to edit different versions of documents simultaneously while recording in real time all the changes they make. In this work I study the various tools and compare their results on the basis of experiments conducted on XML documents deliberately modified to exercise specific kinds of change. There are also several proposals for delta formats to represent changes in XML, but no standard yet exists; I present the main proposals according to their specifications, their implementations, and the results of the experiments conducted. The goal is to provide an assessment of the quality of the tools and, on that basis, to guide users in choosing the appropriate solution for their applications.
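As a toy illustration of what a delta is (real diffing algorithms work recursively on the tree and try to minimize the edit script), the following compares only the direct children of two XML roots, keyed by tag and id, and emits insert/delete/update operations:

```python
import xml.etree.ElementTree as ET

# Naive, flat diff over the direct children of the root; element names
# and the (tag, id) keying are invented for illustration.

def key(e):
    return (e.tag, e.get("id"))

def diff(old_xml, new_xml):
    old = {key(e): e for e in ET.fromstring(old_xml)}
    new = {key(e): e for e in ET.fromstring(new_xml)}
    delta = []
    for k in old.keys() - new.keys():
        delta.append(("delete", k))
    for k in new.keys() - old.keys():
        delta.append(("insert", k))
    for k in old.keys() & new.keys():
        if (old[k].text or "") != (new[k].text or ""):
            delta.append(("update", k))
    return sorted(delta)

OLD = '<doc><p id="1">ciao</p><p id="2">mondo</p></doc>'
NEW = '<doc><p id="1">ciao</p><p id="2">world</p><p id="3">!</p></doc>'
delta = diff(OLD, NEW)
```

A tree-aware algorithm would instead recurse into subtrees and could also detect moves, producing a more concise delta than this flat comparison.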
Abstract:
Speleothems and drip water play an important role in the study of natural phenomena. While previous studies have concentrated mostly on inorganic proxies, interest in organic biomarkers, above all lipid biomarkers, is growing. A new method was therefore developed to determine even-chain fatty acids of chain lengths C12-C20 in speleothem and drip-water samples, using solid-phase extraction followed by HPLC-MS measurement. The method was applied to several samples from the Herbstlabyrinth-Advent Cave. Compared with earlier studies, the required sample amount was reduced from about 4 L to 60-100 mL for drip water and from 10-100 g to about 0.5-3.5 g for stalagmite samples. An internal variation of the analytes was observed. Correlations among the fatty acids suggested that C12-C18 originate from the same source, while C20 partly has other sources. Correlations with δ13C showed that there is a relationship between vegetation density and the occurrence of the fatty acids in the sample material. Comparisons with Mg concentrations showed that the amount of precipitation influences the transport of the fatty acids with the drip water, but not their measured concentration in stalagmite samples. The results indicated that vegetation proxies more conclusive than the fatty acids need to be found. A first non-target screening showed that lignins and tannins can be used as characteristic biomarkers.
Abstract:
The health effects of aerosol particles are strongly influenced by their chemical and physical properties, and thus by their respective formation processes and source characteristics. While the main sources of anthropogenic particle emissions are well studied, the specific emission patterns of numerous small aerosol sources, which can locally and temporarily contribute to a significant deterioration in air quality, remain a research gap. In the present work, combined laboratory and field measurements with an integrative analysis concept, using online (HR-ToF-AMS) and filter-based offline (ATR-FTIR spectroscopy) techniques, are used to investigate the largely unknown physical and chemical properties of the emissions of particular anthropogenic aerosol sources. In addition to a football stadium as a complex mixture of different aerosol sources such as frying and grilling, cigarette smoking and pyrotechnics, the study includes emissions from fireworks, intensive livestock farming (laying hens), civil engineering and road construction works, as well as sewage-derived aerosol particles. The primary particle emissions of the investigated sources are predominantly characterized by small particle sizes (dp < 1 µm) and hence high respirability. By contrast, the aerosol particles in the barn of the intensive livestock operation and the emissions from the civil engineering works show a high mass fraction of particles with dp > 1 µm. The focus of the investigation is on the chemical characterization of the organic particle constituents, which dominate the NR-PM1 emissions of many sources; important source-specific differences appear in the composition of the organic aerosol fraction. The particles released by the burning of pyrotechnic articles and the sewage-derived aerosol particles, by contrast, contain high relative amounts of inorganic substances.
Metal compounds can also be detected in the AMS mass spectra of some specific emissions. Beyond characterizing the emission patterns and dynamics, emission factors are determined for several differently colored smoke cartridges and for the emissions in the barn of the intensive livestock operation, which can be used for quantitative budgeting. In a further step, the empirical data are used to present and discuss the analytical limitations of aerosol mass spectrometry, such as the interference of organic fragment ions by (hydrogen) carbonates, and possible evaluation strategies to overcome these limits. An extensive method development to improve the analytical expressiveness of organic AMS mass spectra shows that, for certain particle types, individual fragment ions in the AMS mass spectra correlate significantly with selected functional molecular groups in the FTIR absorption spectra. Owing to their lack of specificity, a generally valid interpretation of AMS fragment ions as markers for different functional groups is not permissible and is often only possible through the results of the complementary FTIR spectroscopy. Furthermore, the evaporation and ionization of selected metal compounds in the AMS were analyzed. The work makes clear that a qualitative and quantitative evaluation of these substances is not readily possible.
The reasons for this lie in a lack of reproducibility of the evaporation and ionization process due to matrix effects, and in chemical reactions taking place in the ionization chamber and on the vaporizer that depend on preceding analyses (vaporizer history). The findings of this work allow a prioritization of the investigated anthropogenic sources according to particular measurement parameters and provide, for their particle emissions, the starting point for a risk assessment of subsequent atmospheric processes and of potentially negative effects on human health.
Abstract:
Thesis on algorithms for comparing XML documents, with an overview of the various algorithms and a focus on the NDiff algorithm, in particular its handling of attributes.
Abstract:
The 5th generation of mobile networking introduces the concept of "network slicing": the network is "sliced" horizontally, and each slice complies with different requirements in terms of network parameters such as bandwidth and latency. This technology is built on logical rather than physical resources and relies on the virtual network as the core abstraction for obtaining a logical resource. Network Function Virtualisation (NFV) provides the notion of logical resources for a virtual network function, enabling the virtual network; it relies on Software Defined Networking (SDN) as the main technology for realizing the virtual network as a resource, and it also defines the virtual network infrastructure with all the components needed to meet the network-slicing requirements. SDN itself uses cloud computing technology to realize the virtual network infrastructure, and NFV likewise uses virtual computing resources to deploy virtual network functions instead of requiring custom hardware and software for each network function. The key to network slicing is the differentiation of slices in terms of Quality of Service (QoS) parameters, which relies on the ability to manage QoS in a cloud computing environment. QoS in cloud computing denotes the levels of performance, reliability and availability offered. QoS is fundamental for cloud users, who expect providers to deliver the advertised quality characteristics, and for cloud providers, who need to find the right tradeoff between the QoS levels it is possible to offer and operational costs. While QoS properties received constant attention before the advent of cloud computing, the performance heterogeneity and resource isolation mechanisms of cloud platforms have significantly complicated QoS analysis, deployment, prediction and assurance. This is prompting several researchers to investigate automated QoS management methods that can leverage the high programmability of hardware and software resources in the cloud.
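As a toy sketch of the QoS tradeoff just mentioned (all field names, numbers and the admission policy are invented for illustration), a provider admitting network slices might check each request's bandwidth share and latency bound against the shared infrastructure's remaining capacity:

```python
# Invented slice model: each slice declares a bandwidth share (Mbps)
# and a latency bound (ms); admission is a simple feasibility check.

def admit(active_slices, request, capacity_mbps, base_latency_ms):
    used = sum(s["bw"] for s in active_slices)
    if used + request["bw"] > capacity_mbps:
        return False   # would break bandwidth isolation of other slices
    if request["latency"] < base_latency_ms:
        return False   # bound tighter than the infrastructure can offer
    return True

active = [{"bw": 400, "latency": 50}]
accepted = admit(active, {"bw": 300, "latency": 10}, 1000, 5)
rejected = admit(active, {"bw": 700, "latency": 10}, 1000, 5)
```

Real automated QoS management goes well beyond such a static check, predicting performance under heterogeneity and reconfiguring resources at runtime, but the per-slice parameter model is the same starting point.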