953 results for Models and Principles
Abstract:
Natural events are a widely recognized hazard for industrial sites where relevant quantities of hazardous substances are handled, due to the possible generation of cascading events resulting in severe technological accidents (Natech scenarios). Natural events may damage storage and process equipment containing hazardous substances, which may be released, leading to major accident scenarios called Natech events. The need to assess the risk associated with Natech scenarios is growing, and methodologies have been developed to quantify Natech risk, considering both point sources and linear sources such as pipelines. A key element of these procedures is the use of vulnerability models, which estimate the damage probability of an equipment item or pipeline segment as a result of the impact of the natural event. Therefore, the first aim of the PhD project was to outline the state of the art of vulnerability models for equipment and pipelines subject to natural events such as floods, earthquakes, and wind. The project also aimed at the development of new vulnerability models to fill gaps in the literature: in particular, vulnerability models for vertical equipment subject to wind and to flood were developed. Finally, in order to improve the calculation of Natech risk for linear sources, an original methodology was developed for the quantitative risk assessment of Natech scenarios involving pipelines subject to earthquakes. Overall, the results obtained are a step forward in the quantitative risk assessment of Natech accidents. The tools developed open the way to the inclusion of new equipment in the analysis of Natech events, and the methodology for the assessment of linear risk sources such as pipelines provides an important tool for a more accurate and comprehensive assessment of Natech risk.
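To illustrate the vulnerability-model concept at the core of these procedures, the sketch below shows how a probit model of the kind common in quantitative risk assessment maps a hazard intensity to an equipment damage probability. This is an assumption about the general form only; the coefficients a and b are hypothetical placeholders, not values from the thesis.

```python
# Minimal probit vulnerability sketch: Y = a + b*ln(intensity); P = Phi(Y - 5),
# where Phi is the standard normal CDF. Coefficients are illustrative.
from math import log, erf, sqrt

def damage_probability(intensity, a, b):
    """Return the damage probability for a given hazard intensity."""
    y = a + b * log(intensity)
    return 0.5 * (1 + erf((y - 5) / sqrt(2)))  # standard normal CDF

# Example: damage probability of a tank for a 2 m/s flood-water velocity,
# with hypothetical coefficients.
print(damage_probability(2.0, a=5.5, b=1.2))
```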
Abstract:
This thesis aims to investigate, from both a theoretical and a computational point of view, the fundamental properties of phonons. To this end, the quantum models of Einstein and Debye are presented, which allow the analytical derivation of the main macroscopic observables of a solid, such as the average energy and the heat capacity. This is possible through a statistical-mechanical treatment based on the harmonic approximation of the normal vibrational modes of the lattice ions. Thus, the main results concerning the quantum harmonic oscillator are briefly presented first. Subsequently, the topics of phonon dispersion and of the vibrational density of states for 1D and 3D crystal lattices are explored in depth. It is found that the former cannot be considered linear except in the long-wavelength limit, and that the latter can exhibit singularity points correlated with the shape of the dispersion relation. Finally, ab initio computational analyses were carried out on the phonon dispersion, the vibrational density of states, and the Debye frequency of carbon (diamond) using the VASP and Phonopy programs, comparing the results with experimental data available in the literature.
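For context, the Debye-model molar heat capacity that such a treatment derives analytically can be evaluated numerically in a few lines. The sketch below is illustrative only: the Debye temperature used for diamond (about 2230 K) is a commonly quoted literature value, not a result from this work.

```python
# Debye-model molar heat capacity:
# C_V = 9 R (T/theta_D)^3 * integral_0^{theta_D/T} x^4 e^x / (e^x - 1)^2 dx
import numpy as np
from scipy.integrate import quad

R = 8.314  # molar gas constant, J/(mol K)

def debye_cv(T, theta_D):
    integrand = lambda x: x**4 * np.exp(x) / np.expm1(x)**2
    integral, _ = quad(integrand, 0, theta_D / T)
    return 9 * R * (T / theta_D)**3 * integral

# At high T the result approaches the Dulong-Petit limit 3R ~ 24.9 J/(mol K).
for T in (100, 300, 1000, 5000):
    print(T, round(debye_cv(T, theta_D=2230), 3))
```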
Abstract:
A proof of concept for a wearable device is presented to help patients who suffer from panic attacks due to panic disorder. The aim of this device is to enable such patients to manage these stressful episodes by guiding them to regulate their breathing and by informing the caregiver. Panic attack prediction is deployed so that healthcare providers can not only monitor and manage a patient's panic attacks but also carry out an early intervention to reduce the symptom severity of an approaching attack. The patient can get the help they need, ultimately regaining control. The concept of panic attack prediction can lead to personalized treatment of the patient. The study is conducted using a small real-world dataset, and only two primary symptoms of panic attack are used: racing heart rate and hyperventilation (abnormal breathing rate). This thesis project was developed in collaboration with ALTEN Italia, which provided all the required hardware.
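As a sketch of the kind of two-symptom rule such a device could apply (an assumption, not the thesis implementation), the snippet below flags a possible oncoming attack when both heart rate and breathing rate exceed baselines for a sustained window. The thresholds and window length are illustrative.

```python
# Two-symptom sliding-window monitor; thresholds are illustrative placeholders.
from collections import deque

class PanicMonitor:
    def __init__(self, hr_limit=110, br_limit=25, window=10):
        self.hr_limit, self.br_limit = hr_limit, br_limit  # bpm, breaths/min
        self.samples = deque(maxlen=window)                # sliding window

    def update(self, heart_rate, breathing_rate):
        """Feed one sensor sample; return True if an alert should fire."""
        self.samples.append(heart_rate > self.hr_limit and
                            breathing_rate > self.br_limit)
        # Alert only when a full window is abnormal, damping momentary spikes.
        return len(self.samples) == self.samples.maxlen and all(self.samples)

monitor = PanicMonitor()
for hr, br in [(95, 18)] * 5 + [(120, 30)] * 10:
    if monitor.update(hr, br):
        print("alert: guide breathing exercise and notify the caregiver")
        break
```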
Abstract:
Nowadays the idea of injecting world or domain-specific structured knowledge into pre-trained language models (PLMs) is becoming an increasingly popular approach for tackling problems such as biases, hallucinations, huge architectural sizes, and lack of explainability, all of which are critical for real-world natural language processing applications in sensitive fields like bioinformatics. One recent work that has garnered much attention in neuro-symbolic AI is QA-GNN, an end-to-end model for multiple-choice open-domain question answering (MCOQA) tasks via interpretable text-graph reasoning. Unlike previous publications, QA-GNN mutually informs PLMs and graph neural networks (GNNs) on top of relevant facts retrieved from knowledge graphs (KGs). However, taking a more holistic view, existing PLM+KG contributions mainly consider commonsense benchmarks and ignore, or only shallowly analyze, performance on biomedical datasets. This thesis starts by proposing a deep investigation of QA-GNN for biomedicine, comparing existing and brand-new PLMs, KGs, edge-aware GNNs, preprocessing techniques, and initialization strategies. By combining the insights that emerged from DISI's research, we introduce Bio-QA-GNN, which includes a KG. This work improves the state of the art for MCOQA models on biomedical/clinical text, largely outperforming the original model (+3.63\% accuracy on MedQA). Our findings also contribute to a better understanding of the degree of explanation allowed by joint text-graph reasoning architectures and their effectiveness on different medical subjects and reasoning types. Code, models, datasets, and demos to reproduce the results are freely available at: \url{https://github.com/disi-unibo-nlp/bio-qagnn}.
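The sketch below conveys the shape of QA-GNN-style joint text-graph reasoning: a PLM vector for each (question, answer) pair is combined with a pooled GNN representation of a retrieved KG subgraph to score the choice. It is a minimal assumption-laden illustration, not the Bio-QA-GNN code; the PLM output is mocked with random tensors, and all layer sizes are invented.

```python
# Joint text-graph scoring sketch for multiple-choice QA (illustrative only).
import torch
import torch.nn as nn

class SimpleGNNLayer(nn.Module):
    """One round of mean-aggregation message passing over a KG subgraph."""
    def __init__(self, dim):
        super().__init__()
        self.lin = nn.Linear(2 * dim, dim)

    def forward(self, x, adj):
        # x: (num_nodes, dim) node features; adj: (num_nodes, num_nodes) 0/1
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
        neigh = adj @ x / deg                     # mean of neighbour features
        return torch.relu(self.lin(torch.cat([x, neigh], dim=-1)))

class TextGraphScorer(nn.Module):
    """Scores one (question, answer) pair from text and subgraph evidence."""
    def __init__(self, dim=128, layers=2):
        super().__init__()
        self.gnn = nn.ModuleList(SimpleGNNLayer(dim) for _ in range(layers))
        self.score = nn.Linear(2 * dim, 1)

    def forward(self, text_vec, node_feats, adj):
        h = node_feats
        for layer in self.gnn:
            h = layer(h, adj)
        graph_vec = h.mean(dim=0)                 # pooled subgraph summary
        return self.score(torch.cat([text_vec, graph_vec]))

# Toy usage: 4 answer choices, each with a mocked PLM vector and KG subgraph.
model = TextGraphScorer()
scores = []
for _ in range(4):
    text_vec = torch.randn(128)                  # stand-in for PLM [CLS] output
    node_feats = torch.randn(10, 128)            # stand-in for KG node embeddings
    adj = (torch.rand(10, 10) > 0.7).float()
    scores.append(model(text_vec, node_feats, adj))
print(torch.stack(scores).argmax().item())       # predicted choice index
```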
Abstract:
The goal of this thesis was to design a global application architecture to guide application development in an industrial maintenance company. An application architecture describes the functionality of computer programs from the end users' point of view, and producing one is part of strategic information systems planning. The purpose of the architecture is to ensure that information systems are designed as a whole to support the organisation's operations. The work was guided by the principles and models of strategic information systems planning; the techniques were the same as in project-level information systems design. Work on the application architecture began by studying the current situation in the company as well as its business and IT strategies. The analysis focused mainly on business processes and the applications in use, and was carried out chiefly through interviews and document reviews. Next, requirements for future applications were derived from the interviews and from the material of the previous phase. The requirements most important to the business were selected to be fulfilled by the architecture. Producing the actual architecture consisted mainly of selecting applications and defining the relationships between them. Development projects were then defined on the basis of the architecture.
Abstract:
This master's thesis examines the performance of an industrial company from the customers' perspective. The goal of the work is to build for the target company a scorecard that describes the company's performance from the customer's point of view. On the basis of the literature review, the relevant methods for measuring and analysing customer service needs are identified. In addition, the models and principles for constructing a performance measurement system are presented, together with the factors affecting its implementation. In the empirical part, the service needs of key customers and the company's performance level are surveyed and analysed. A measurement system is built for the target company on the basis of the customer service needs, tailored to the company's operating environment and strategic goals. The scorecard is built on the customer-promise measures of the customer perspective of the Balanced Scorecard. As a result of the work, the target company obtains a picture of its service level. With the customer-promise scorecard presented in this work, the company can monitor and develop its performance. The work also highlights important factors affecting the implementation of the measurement system and gives recommendations on operating principles for the implementation process.
Abstract:
This paper discusses issues in Ukraine's new three-level pension system. First, it presents a mathematical model that allows calculating the optimal size of contributions to a non-state pension fund. Next, the non-state pension fund chooses an asset management company; for this purpose, an approach based on Kohonen networks is proposed to classify the asset management companies operating in the Ukrainian market. Once the asset management company is chosen, it receives the pension contributions of the participants of the non-state pension fund and has to invest these contributions profitably. This paper proposes an approach to choosing the most profitable investment project using decision trees. The new pension system was lawfully ratified only four years ago and is still developing, which is why this paper is particularly relevant.
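The sketch below illustrates the expected-monetary-value logic that underlies choosing the most profitable investment project with a decision tree. The projects, outcomes, and probabilities are invented for illustration; this is not data or code from the paper.

```python
# Expected monetary value (EMV) over a one-level decision tree (illustrative).
def expected_value(outcomes):
    """outcomes: list of (probability, payoff) pairs for one project."""
    return sum(p * payoff for p, payoff in outcomes)

projects = {
    "government bonds": [(1.00, 0.05)],
    "real estate":      [(0.60, 0.12), (0.40, -0.03)],
    "equities":         [(0.50, 0.20), (0.30, 0.04), (0.20, -0.10)],
}

best = max(projects, key=lambda name: expected_value(projects[name]))
for name, outcomes in projects.items():
    print(f"{name}: EMV = {expected_value(outcomes):.3f}")
print("choose:", best)
```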
Abstract:
Sustainable development support, balanced scorecard development, and business process modeling are viewed from the position of systemology. The extensional, intentional, and potential properties of a system are considered as necessary to satisfy the functional requirements of a meta-system. The correspondence between the extensional, intentional, and potential properties of a system and its sustainable, unsustainable, crisis, and catastrophic states is determined. The cause of the unattainability of the system mission is uncovered. The correspondence between the extensional, intentional, and potential properties of a system and the balanced scorecard perspectives is shown. The IDEF0 function modeling method is checked against the balanced scorecard perspectives, and the correspondence between balanced scorecard perspectives and IDEF0 notations is considered.
Abstract:
This paper highlights the challenges of integrating satellite monitoring systems, in particular on a Grid platform, and reviews possible solutions to these problems. We describe integration issues at different levels: the data integration level and the task management level (job submission in Grid terms). We show an example of the described technologies applied to the integration of the monitoring systems of Ukraine (National Space Agency of Ukraine, NASU) and Russia (Space Research Institute RAS, IKI RAN). Another example concerns the development of an InterGrid infrastructure that integrates several regional and national Grid systems: the Ukrainian Academician Grid (with its satellite data processing Grid segment) and the RSGS Grid (Chinese Academy of Sciences).
Abstract:
Finding a multicriteria solution is by its nature a compromise and is fundamentally based on the use of subjective information. The possibility of solving the problem rests on the hypothesis that some utility function exists. The traditional approach of linearising the utility function has many shortcomings. The concept of a nonlinear scheme of compromises is proposed.
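To make the contrast concrete, the sketch below compares the linear (weighted-sum) utility with one commonly cited form of the nonlinear scheme of compromises, Y(x) = sum_k w_k / (1 - y_k(x)), which heavily penalises any criterion that approaches its worst value. The specific functional form is an assumption about the scheme intended here, and the criteria values and weights are illustrative; criteria y_k are taken as normalised to [0, 1) with smaller values better.

```python
# Linear vs. nonlinear aggregation of normalised criteria (illustrative).
def linear_utility(y, w):
    return sum(wk * yk for wk, yk in zip(w, y))

def nonlinear_compromise(y, w):
    return sum(wk / (1.0 - yk) for wk, yk in zip(w, y))

w = [0.5, 0.5]
balanced = [0.5, 0.5]    # both criteria mediocre
skewed   = [0.05, 0.95]  # one criterion excellent, the other nearly worst

for name, y in [("balanced", balanced), ("skewed", skewed)]:
    print(name, round(linear_utility(y, w), 3),
          round(nonlinear_compromise(y, w), 3))
# The linear utility rates both alternatives equally (0.5 each), while the
# nonlinear scheme strongly penalises the skewed one: the shortcoming of
# linearisation that the abstract points to.
```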
Abstract:
Cancer is a major cause of morbidity and mortality worldwide, with a disease burden estimated to increase in the coming decades. Disease heterogeneity and limited information on cancer biology and disease mechanisms are aspects that 2D cell cultures fail to address. We review the current state of the art in 3D tissue engineering (TE) models developed for and used in cancer research. Scaffold-based TE models and microfluidics are assessed for their potential to fill the gap between 2D models and clinical application. Recent advances in combining the principles of 3D TE models and microfluidics are discussed, with a special focus on biomaterials and the most promising chip-based 3D models.
Abstract:
Individual-based models (IBMs) can simulate the actions of individual animals as they interact with one another and the landscape in which they live. When used in spatially explicit landscapes, IBMs can show how populations change over time in response to management actions. For instance, IBMs are being used to design strategies for conservation and for the exploitation of fisheries, and for assessing the effects on populations of major construction projects and of novel agricultural chemicals. In such real-world contexts, it becomes especially important to build IBMs in a principled fashion, and to approach calibration and evaluation systematically. We argue that insights from physiological and behavioural ecology offer a recipe for building realistic models, and that Approximate Bayesian Computation (ABC) is a promising technique for the calibration and evaluation of IBMs. IBMs are constructed primarily from knowledge about individuals. In ecological applications the relevant knowledge is found in physiological and behavioural ecology, and we approach these from an evolutionary perspective by taking into account how physiological and behavioural processes contribute to life histories, and how those life histories evolve. Evolutionary life history theory shows that, other things being equal, organisms should grow to sexual maturity as fast as possible, and then reproduce as fast as possible, while minimising per capita death rate. Physiological and behavioural ecology are largely built on these principles together with the laws of conservation of matter and energy. To complete construction of an IBM, information is also needed on the effects of competitors, conspecifics, and food scarcity; on the maximum rates of ingestion, growth, and reproduction; and on life-history parameters. Using this knowledge about physiological and behavioural processes provides a principled way to build IBMs, but model parameters vary between species and are often difficult to measure. A common solution is to manually compare model outputs with observations from real landscapes and so obtain parameters which produce acceptable fits of model to data. However, this procedure can be convoluted and lead to over-calibrated and thus inflexible models. Many formal statistical techniques are unsuitable for use with IBMs, but we argue that ABC offers a potential way forward. It can be used to calibrate and compare complex stochastic models and to assess the uncertainty in their predictions. We describe methods used to implement ABC in an accessible way and illustrate them with examples and discussion of recent studies. Although much progress has been made, theoretical issues remain, and some of these are outlined and discussed.
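The core of ABC rejection sampling, as it might be used to calibrate an IBM parameter, fits in a few lines. In the sketch below the "model" is a trivial stand-in for an IBM run, and the prior range, summary statistic, observed value, and tolerance are all illustrative assumptions.

```python
# ABC rejection sampling sketch: keep prior draws whose simulated summary
# statistic lands within a tolerance of the observed one.
import random

def run_model(growth_rate, n_steps=50):
    """Stand-in for an IBM simulation: noisy logistic-style population."""
    pop = 10.0
    for _ in range(n_steps):
        pop += growth_rate * pop * (1 - pop / 1000) + random.gauss(0, 2)
    return pop  # summary statistic: final population size

observed = 600.0   # summary statistic from field data (illustrative)
tolerance = 50.0
accepted = []
for _ in range(5000):
    theta = random.uniform(0.0, 0.5)          # draw from a uniform prior
    if abs(run_model(theta) - observed) < tolerance:
        accepted.append(theta)                # keep parameters that fit

# The accepted values approximate the posterior of the growth rate.
print(len(accepted), sum(accepted) / max(len(accepted), 1))
```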
Abstract:
Once the preserve of university academics and research laboratories with high-powered and expensive computers, the power of sophisticated mathematical fire models has now arrived on the desktop of the fire safety engineer. It is a revolution made possible by parallel advances in PC technology and fire modelling software. But while the tools have proliferated, there has not been a corresponding transfer of knowledge and understanding of the discipline from expert to general user. It is a serious shortfall of which the lack of suitable engineering courses dealing with the subject is symptomatic, if not the cause. The computational vehicles to run the models and an understanding of fire dynamics are not enough to exploit these sophisticated tools. Too often, they become 'black boxes' producing magic answers in exciting three-dimensional colour graphics and client-satisfying 'virtual reality' imagery. As well as a fundamental understanding of the physics and chemistry of fire, the fire safety engineer must have at least a rudimentary understanding of the theoretical basis supporting fire models in order to appreciate their limitations and capabilities. The five-day short course, "Principles and Practice of Fire Modelling", run by the University of Greenwich, attempts to bridge the divide between the expert and the general user, providing the latter with the expertise needed to understand the results of mathematical fire modelling. The course and the associated textbook, "Mathematical Modelling of Fire Phenomena", are aimed at students and professionals with a wide and varied background, and offer a friendly guide through the unfamiliar terrain of mathematical modelling. These concepts and techniques are introduced and demonstrated in seminars, and those attending also gain experience in using the methods during hands-on tutorial and workshop sessions. On completion of this short course, those participating should: be familiar with the concepts of zone and field modelling; be familiar with zone and field model assumptions; have an understanding of the capabilities and limitations of modelling software packages for zone and field modelling; be able to select and use the most appropriate mathematical software and demonstrate its use in compartment fire applications; and be able to interpret model predictions. The result is that the fire safety engineer is empowered to realise the full value of mathematical models to help in the prediction of fire development, and to determine the consequences of fire under a variety of conditions. This in turn enables him or her to design and implement safety measures which can potentially control, or at the very least reduce, the impact of fire.
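For a flavour of the kind of hand calculation that a zone model automates, the sketch below evaluates the MQH (McCaffrey-Quintiere-Harkleroad) correlation for the hot-gas-layer temperature rise in a naturally ventilated compartment. The inputs are illustrative and the example is not material from the course itself.

```python
# MQH correlation: dT = 6.85 * (Q^2 / (A_o * sqrt(H_o) * h_k * A_T))^(1/3)
def mqh_temperature_rise(Q, A_o, H_o, h_k, A_T):
    """
    Q   : fire heat release rate [kW]
    A_o : area of the ventilation opening [m^2]
    H_o : height of the ventilation opening [m]
    h_k : effective heat-transfer coefficient of the boundaries [kW/(m^2 K)]
    A_T : total area of the compartment surfaces [m^2]
    Returns the hot-gas-layer temperature rise [K].
    """
    return 6.85 * (Q**2 / (A_o * H_o**0.5 * h_k * A_T)) ** (1.0 / 3.0)

# 500 kW fire, standard doorway (0.8 m x 2.0 m) in a small room.
print(mqh_temperature_rise(Q=500, A_o=1.6, H_o=2.0, h_k=0.03, A_T=60))
```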
Abstract:
Single-cell functional proteomics assays can connect genomic information to biological function through quantitative and multiplex protein measurements. Tools for single-cell proteomics have developed rapidly over the past five years and are providing unique opportunities. This thesis describes an emerging microfluidics-based toolkit for single-cell functional proteomics, focusing on the development of single-cell barcode chips (SCBCs) with applications in fundamental and translational cancer research.
The discussion begins with a microchip designed to simultaneously quantify a panel of secreted, cytoplasmic, and membrane proteins from single cells; this chip is the prototype for subsequent proteomic microchips with more sophisticated designs for preclinical cancer research or clinical applications. The SCBCs are a highly versatile and information-rich tool for single-cell functional proteomics. They are based upon isolating individual cells, or a defined number of cells, within microchambers, each of which is equipped with a large antibody microarray (the barcode), with between a few hundred and ten thousand microchambers included within a single microchip. Functional proteomics assays at single-cell resolution yield unique pieces of information that significantly shape thinking in cancer research. An in-depth discussion of the analysis and interpretation of this unique information, such as functional protein fluctuations and protein-protein correlative interactions, will follow.
The SCBC is a powerful tool for resolving the functional heterogeneity of cancer cells. It has the capacity to extract a comprehensive picture of the signal transduction network from single tumor cells and thus provides insight into the effect of targeted therapies on protein signaling networks. We will demonstrate this point by applying the SCBCs to investigate three isogenic cell lines of glioblastoma multiforme (GBM).
The cancer cell population is highly heterogeneous, with high-amplitude fluctuations at the single-cell level, which in turn grant the robustness of the entire population. The concept of a stable population existing in the presence of random fluctuations is reminiscent of many physical systems that are successfully understood using statistical physics. Thus, tools derived from that field can plausibly be applied, using fluctuations to determine the nature of signaling networks. In the second part of the thesis, we will focus on such a case, using thermodynamics-motivated principles to understand cancer cell hypoxia: single-cell proteomics assays coupled with a quantitative version of Le Chatelier's principle derived from statistical mechanics yield detailed and surprising predictions, which were found to be correct in both cell line and primary tumor models.
The third part of the thesis demonstrates the application of this technology in preclinical cancer research to study the resistance of GBM cancer cells to molecular targeted therapy. Physical approaches to anticipate therapy resistance and to identify effective therapy combinations will be discussed in detail. Our approach is based upon elucidating the signaling coordination within the phosphoprotein signaling pathways that are hyperactivated in human GBMs, and interrogating how that coordination responds to perturbation by targeted inhibitors. Strongly coupled protein-protein interactions constitute most signaling cascades, and a physical analogy of such a system is the strongly coupled atom-atom interactions in a crystal lattice. Just as atomic interactions can be decomposed into a series of independent normal vibrational modes, a simplified picture of signaling network coordination can be achieved by diagonalizing protein-protein correlation or covariance matrices, decomposing the pairwise correlative interactions into a set of distinct linear combinations of signaling proteins (i.e. independent signaling modes). By doing so, two independent signaling modes, one associated with mTOR signaling and a second associated with ERK/Src signaling, have been resolved, which in turn allows us to anticipate resistance and to design combination therapies that are effective, as well as to identify those therapies and therapy combinations that will be ineffective. We validated our predictions in mouse tumor models and all predictions were borne out.
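The decomposition described above reduces, in its simplest form, to an eigendecomposition of the protein-protein covariance matrix: each eigenvector defines one independent signaling mode. The sketch below illustrates this with random stand-in data (the protein panel and coupling are invented, not SCBC measurements).

```python
# Diagonalise a protein-protein covariance matrix into independent modes.
import numpy as np

rng = np.random.default_rng(0)
proteins = ["p-mTOR", "p-ERK", "p-Src", "p-S6K"]     # illustrative panel
data = rng.normal(size=(500, len(proteins)))          # cells x proteins
data[:, 3] += 0.8 * data[:, 0]                        # couple S6K to mTOR

cov = np.cov(data, rowvar=False)                      # protein covariance
eigvals, eigvecs = np.linalg.eigh(cov)                # symmetric -> eigh

# Modes sorted by the variance they explain; each eigenvector is one
# independent linear combination of the measured proteins.
order = np.argsort(eigvals)[::-1]
for i in order[:2]:
    weights = ", ".join(f"{p}: {w:+.2f}"
                        for p, w in zip(proteins, eigvecs[:, i]))
    print(f"mode (variance {eigvals[i]:.2f}): {weights}")
```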
In the last part, some preliminary results on the clinical translation of single-cell proteomics chips will be presented. The successful demonstration of our work on human-derived xenografts provides the rationale to extend our current work into the clinic. It will enable us to interrogate GBM tumor samples in a way that could potentially yield a straightforward, rapid interpretation, so that we can give therapeutic guidance to the attending physicians within a clinically relevant time scale. The technical challenges of the clinical translation will be presented, and our solutions to address them will be discussed as well. A clinical case study will then follow, in which some preliminary data collected from a pediatric GBM patient bearing an EGFR-amplified tumor will be presented to demonstrate the general protocol and workflow of the proposed clinical studies.
Abstract:
Universidade Estadual de Campinas. Faculdade de Educação Física