8 results for probabilistic refinement calculus

in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland


Relevance:

80.00%

Publisher:

Abstract:

Formal methods provide a means of reasoning about computer programs in order to prove correctness criteria. One subtype of formal methods is based on the weakest precondition predicate transformer semantics and uses guarded commands as the basic modelling construct. Examples of such formalisms are Action Systems and Event-B. Guarded commands can intuitively be understood as actions that may be triggered when an associated guard condition holds. Guarded commands whose guards hold are nondeterministically chosen for execution, but no further control flow is present by default. Such a modelling approach is convenient for proving correctness, and the Refinement Calculus allows for a stepwise development method. It also has a parallel interpretation facilitating the development of concurrent software, and it is suitable for describing event-driven scenarios. However, for many application areas, the execution paradigm traditionally used comprises more explicit control flow, which constitutes an obstacle to using the above-mentioned formal methods. In this thesis, we study how guarded-command-based modelling approaches can be conveniently and efficiently scheduled in different scenarios. We first focus on the modelling of trust for transactions in a social networking setting. Due to the event-based nature of the scenario, the use of guarded commands turns out to be relatively straightforward. We continue by studying the modelling of concurrent software, with particular focus on compute-intensive scenarios. We go from theoretical considerations to the feasibility of implementation by evaluating the performance and scalability of executing a case-study model in parallel using automatic scheduling performed by a dedicated scheduler. Finally, we propose a more explicit and non-centralised approach in which the flow of each task is controlled by a schedule of its own. The schedules are expressed in a dedicated scheduling language, and patterns assist the developer in proving correctness of the scheduled model with respect to the original one.
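
As a rough illustration of the execution model described above (a hypothetical sketch, not code or notation from the thesis), the following Python fragment repeatedly picks one enabled guarded command at random and executes it; the state variables, guards and commands are invented:

```python
import random

# Minimal sketch of guarded-command execution: each action is a (guard, command)
# pair, and any action whose guard currently holds may be chosen for execution.
state = {"x": 0, "y": 5}

actions = [
    (lambda s: s["x"] < 10, lambda s: s.update(x=s["x"] + 1)),  # increment x while x < 10
    (lambda s: s["y"] > 0,  lambda s: s.update(y=s["y"] - 1)),  # decrement y while y > 0
]

while True:
    enabled = [command for guard, command in actions if guard(state)]
    if not enabled:                      # no guard holds: execution terminates
        break
    random.choice(enabled)(state)        # nondeterministic choice among enabled actions

print(state)                             # e.g. {'x': 10, 'y': 0}
```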

Relevance:

20.00%

Publisher:

Abstract:

In this Master's thesis, a probabilistic risk analysis of external hazards was carried out for the interim storage facility at the Olkiluoto nuclear power plant, which is based on pool storage of spent nuclear fuel. Probabilistic risk assessment (PRA) is a commonly used approach for identifying and analysing risks at nuclear power plants. The purpose of the work was to produce an entirely new external-hazards PRA, since no comparable risk studies in this research area had previously been carried out in Finland. A further motivation for the study is the heightened role of external hazards in the safety of interim storage of spent nuclear fuel, emphasised by natural disasters that have occurred around the world. The structure of the PRA was based on a methodology created at the beginning of the study. The analysis is based on the identification of possible external hazards, excluding intentional man-made damage. Based on their occurrence frequencies and damage potential, the identified external hazards were either screened out using screening criteria defined in the study or analysed in more detail. The results showed that data on very rare external hazards are incomplete. Most of these very rare external hazards have never occurred, and probably never will occur, in the area affecting Olkiluoto or even in Finland. For example, the roles of lightning strikes and oil exposure, and their effects on the availability of various components, are known only with considerable uncertainty. Overall, the results of the study can be considered significant, because they identify the external hazards whose effects should be studied in more detail. More detailed knowledge of very rare external hazards would sharpen the estimates of initiating-event frequencies.
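
As a rough, hypothetical illustration of the screening step described above (the hazards, frequencies and limits below are invented and are not values from the thesis), hazards can be divided into those screened out and those analysed in more detail:

```python
# Hypothetical screening sketch: a hazard is analysed in detail only if both its
# annual occurrence frequency and its relative damage potential exceed the
# screening limits; otherwise it is screened out. All values are illustrative.
FREQ_LIMIT = 1e-7     # occurrences per year
DAMAGE_LIMIT = 0.1    # relative damage potential on a 0..1 scale

hazards = {
    # name: (annual frequency, relative damage potential)
    "lightning strike": (1e-3, 0.3),
    "oil exposure":     (1e-4, 0.2),
    "meteorite impact": (1e-9, 0.9),
}

analysed = [name for name, (freq, damage) in hazards.items()
            if freq >= FREQ_LIMIT and damage >= DAMAGE_LIMIT]
screened_out = sorted(set(hazards) - set(analysed))

print("analysed in detail:", analysed)
print("screened out:", screened_out)
```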

Relevance:

20.00%

Publisher:

Abstract:

Models of intermolecular interaction are widely used in biology; analysis of contacts between proteins and drug design are typical application areas for such models. A model describing molecular interactions can be formulated from biophysical theory, which tends to result in an extremely heavy computational burden even for simple applications. An alternative way to formulate models is to exploit large databases containing structural measurements obtained, for example, by X-ray diffraction. When empirical measurement data are used directly, a statistical model makes it possible to take the uncertainty and inexactness of the data into account in an adequate way, while keeping the computational burden at a more reasonable level than that of quantum-mechanical methods, which in principle should give the optimal results. In this thesis, a 3D model based on Bayesian statistics was developed for the numerical study of intermolecular interactions. The purpose of the model is to produce predictions of which, or what kinds of, molecular structures are preferred in a given context, i.e. are more probable in an interaction. The model was tested in molecular environments essential for its intended use: a small molecule at its binding site in a protein, and the interface between the proteins of a complex. The numerical results obtained correspond well to experimental results previously reported in the literature, for example qualitative binding affinities and chemical knowledge of the spatial bonding capabilities of certain amino acids. The thesis also includes preliminary tests of the statistical approach for modelling the important structural adaptability of molecules. In practice, the developed model is intended as one component of a more comprehensive analysis method, such as a so-called pharmacophore model.
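
As a rough illustration of the kind of Bayesian preference estimate described above (a hypothetical sketch, not the model developed in the thesis; the contact geometries and counts are invented), posterior probabilities of alternative structures can be estimated from database counts with a Dirichlet prior:

```python
# Hypothetical sketch: estimate which contact geometry is preferred from counts
# extracted from a structural database, using a symmetric Dirichlet prior so that
# sparse data retain their uncertainty. Categories and counts are illustrative.
counts = {"stacked": 42, "t-shaped": 17, "parallel-displaced": 9}
alpha = 1.0  # Dirichlet pseudo-count per category

total = sum(counts.values()) + alpha * len(counts)
posterior = {geometry: (n + alpha) / total for geometry, n in counts.items()}

preferred = max(posterior, key=posterior.get)   # most probable geometry in this context
print(posterior)
print("preferred geometry:", preferred)
```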

Relevance:

20.00%

Publisher:

Abstract:

The recent emergence of low-cost RGB-D sensors has brought new opportunities for robotics by providing affordable devices that deliver synchronized images with both color and depth information. In this thesis, recent work on pose estimation utilizing RGB-D sensors is reviewed. In addition, a pose recognition system for rigid objects using RGB-D data is implemented. The implementation uses half-edge primitives extracted from the RGB-D images for pose estimation. The system is based on the probabilistic object representation framework by Detry et al., which utilizes Nonparametric Belief Propagation for pose inference. Experiments are performed on household objects to evaluate the performance and robustness of the system.
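
As a loose illustration of working with RGB-D data (a hypothetical sketch using OpenCV, not the thesis implementation; file names and thresholds are invented), edge pixels can be extracted from both the color and the depth channel of a registered image pair:

```python
import cv2
import numpy as np

# Hypothetical sketch: combine intensity edges and depth discontinuities of a
# registered RGB-D pair into a crude edge map; such edge pixels, annotated with
# orientation and depth, could serve as simple edge primitives.
rgb = cv2.imread("frame_rgb.png")                             # 8-bit color image
depth = cv2.imread("frame_depth.png", cv2.IMREAD_UNCHANGED)   # 16-bit depth map

gray = cv2.cvtColor(rgb, cv2.COLOR_BGR2GRAY)
color_edges = cv2.Canny(gray, 50, 150)                        # intensity edges

depth_f = depth.astype(np.float32)
kernel = np.ones((3, 3), np.uint8)
grad = cv2.morphologyEx(depth_f, cv2.MORPH_GRADIENT, kernel)  # local depth variation
depth_edges = np.where(grad > 30, 255, 0).astype(np.uint8)    # depth discontinuities

edges = cv2.bitwise_or(color_edges, depth_edges)
print("edge pixels:", int(np.count_nonzero(edges)))
```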

Relevance:

20.00%

Publisher:

Abstract:

In the field of molecular biology, scientists for decades adopted a reductionist perspective in their inquiries, being predominantly concerned with the intricate mechanistic details of subcellular regulatory systems. However, integrative thinking has also been applied on a smaller scale in molecular biology for at least half a century, in order to understand the underlying processes of cellular behaviour. It was not until the genomic revolution at the end of the previous century that model building was required to account for systemic properties of cellular activity. Our system-level understanding of cellular function is to this day hindered by drastic limitations in our capability to predict cellular behaviour in a way that reflects system dynamics and system structure. To this end, systems biology aims for a system-level understanding of functional intra- and inter-cellular activity. Modern biology brings about a high volume of data, whose comprehension we cannot even aim for in the absence of computational support. Computational modelling hence bridges modern biology and computer science, providing a number of assets that prove invaluable in the analysis of complex biological systems, such as a rigorous characterization of the system structure, simulation techniques, and perturbation analysis. Computational biomodels have grown considerably in size in recent years, with major contributions being made towards the simulation and analysis of large-scale models, starting with signalling pathways and culminating with whole-cell models, tissue-level models, organ models and full-scale patient models. The simulation and analysis of models of such complexity very often requires the integration of various sub-models, entwined at different levels of resolution and organized over several levels of hierarchy. This thesis revolves around the concept of quantitative model refinement in relation to the process of model building in computational systems biology. The thesis proposes a sound computational framework for the stepwise augmentation of a biomodel. One starts with an abstract, high-level representation of a biological phenomenon, which is materialised into an initial model that is validated against a set of existing data. Subsequently, the model is refined to include more details regarding its species and/or reactions. The framework is employed in the development of two models, one for the heat shock response in eukaryotes and the second for the ErbB signalling pathway. The thesis spans several formalisms used in computational systems biology that are inherently quantitative: reaction-network models, rule-based models and Petri net models, as well as a recent, intrinsically qualitative formalism: reaction systems. The choice of modelling formalism is, however, determined by the nature of the question the modeller aims to answer. Quantitative model refinement turns out to be not only essential in the model development cycle, but also beneficial for the compilation of large-scale models, whose development requires the integration of several sub-models across various levels of resolution and underlying formal representations.
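
As a rough, hypothetical illustration of quantitative model refinement (not a model from the thesis), the sketch below refines one species of a mass-action degradation reaction into two subspecies; splitting the initial amount while reusing the original rate constant preserves the numerical behaviour of the original model:

```python
import numpy as np

# Original model: a single species X degrading with mass-action kinetics,
# dX/dt = -k*X, so X(t) = x0 * exp(-k*t). Values are illustrative.
k = 0.5            # degradation rate constant
x0 = 100.0         # initial amount of X
t = np.linspace(0.0, 10.0, 101)
x = x0 * np.exp(-k * t)

# Refined model: X is split into two subspecies X1 and X2 that keep the same
# kinetics; their initial amounts sum to the original initial amount.
x1 = 0.3 * x0 * np.exp(-k * t)
x2 = 0.7 * x0 * np.exp(-k * t)

# The refinement preserves the fit: X1(t) + X2(t) reproduces X(t).
assert np.allclose(x, x1 + x2)
print("maximum deviation:", float(np.max(np.abs(x - (x1 + x2)))))
```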

Relevance:

20.00%

Publisher:

Abstract:

Resilience is the property of a system to remain trustworthy despite changes. Changes of various kinds, whether due to failures of system components or varying operational conditions, significantly increase the complexity of system development. Therefore, advanced development technologies are required to build robust and flexible system architectures capable of adapting to such changes. Moreover, powerful quantitative techniques are needed to assess the impact of these changes on various system characteristics. Architectural flexibility is achieved by embedding into the system design the mechanisms for identifying changes and reacting to them. Hence a resilient system should have both advanced monitoring and error-detection capabilities to recognise changes and sophisticated reconfiguration mechanisms to adapt to them. The aim of such reconfiguration is to ensure that the system stays operational, i.e., remains capable of achieving its goals. Design, verification and assessment of the system reconfiguration mechanisms are a challenging and error-prone engineering task. In this thesis, we propose and validate a formal framework for the development and assessment of resilient systems. Such a framework provides us with the means to specify and verify complex component interactions, model their cooperative behaviour in achieving system goals, and analyse the chosen reconfiguration strategies. Due to the variety of properties to be analysed, such a framework should have an integrated nature. To ensure the functional correctness of the system, it should rely on formal modelling and verification, while, to assess the impact of changes on such properties as performance and reliability, it should be combined with quantitative analysis. To ensure scalability of the proposed framework, we choose Event-B as the basis for reasoning about functional correctness. Event-B is a state-based formal approach that promotes the correct-by-construction development paradigm and formal verification by theorem proving. Event-B has mature, industrial-strength tool support: the Rodin platform. Proof-based verification, as well as the reliance on abstraction and decomposition adopted in Event-B, provides the designers with powerful support for the development of complex systems. Moreover, top-down system development by refinement allows the developers to explicitly express and verify critical system-level properties. Besides ensuring functional correctness, to achieve resilience we also need to analyse a number of non-functional characteristics, such as reliability and performance. Therefore, in this thesis we also demonstrate how formal development in Event-B can be combined with quantitative analysis. Namely, we experiment with integrating techniques such as probabilistic model checking in PRISM and discrete-event simulation in SimPy with formal development in Event-B. Such an integration allows us to assess how changes and different reconfiguration strategies affect the overall system resilience. The approach proposed in this thesis is validated by a number of case studies from such areas as robotics, space, healthcare and the cloud domain.
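
As a small, hypothetical illustration of the kind of discrete-event simulation mentioned above (a SimPy sketch, not one of the models developed in the thesis; all rates, durations and names are invented), one can estimate how often a reconfiguration completes within a deadline when component failures occur at random:

```python
import random
import simpy

MTBF = 50.0          # mean time between component failures
RECONF_TIME = 3.0    # time needed to apply a new configuration
DEADLINE = 5.0       # budget within which reconfiguration must complete
stats = {"on_time": 0, "late": 0}

def component(env):
    """Fail at exponentially distributed intervals and trigger reconfiguration."""
    while True:
        yield env.timeout(random.expovariate(1.0 / MTBF))
        env.process(reconfigure(env))

def reconfigure(env):
    """Detect the change, apply the new configuration, record whether it was on time."""
    start = env.now
    yield env.timeout(random.uniform(0.5, 2.5))   # detection latency
    yield env.timeout(RECONF_TIME)                # reconfiguration itself
    stats["on_time" if env.now - start <= DEADLINE else "late"] += 1

random.seed(1)
env = simpy.Environment()
env.process(component(env))
env.run(until=10_000)

total = stats["on_time"] + stats["late"]
print("on-time fraction:", stats["on_time"] / total if total else None)
```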

Relevance:

20.00%

Publisher:

Abstract:

Building a computational model of a complex biological system is an iterative process. It starts from an abstraction of the process and then incorporates more details regarding the specific biochemical reactions, which changes the model fit. Meanwhile, the model's numerical properties, such as its numerical fit and validation, should be preserved. However, refitting the model after each refinement iteration is computationally expensive. An alternative approach, known as quantitative model refinement, preserves the model fit without the need to refit the model after each refinement iteration. The aim of this thesis is to develop and implement a tool called ModelRef, which performs quantitative model refinement automatically. It is implemented both as a stand-alone Java application and as a component of the Anduril framework. ModelRef performs data refinement of a model and generates the results in two well-known formats (SBML and CPS). The tool reduces the time and resources needed, as well as the errors introduced, compared with the traditional approach of refitting the whole model after each refinement step.
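
As a rough illustration of emitting refined species in SBML (a hypothetical sketch using python-libsbml, not the ModelRef code; identifiers and file names are invented):

```python
import libsbml

def write_refined_species(species_ids, path):
    """Write a minimal SBML Level 3 model containing the given refined species."""
    doc = libsbml.SBMLDocument(3, 1)
    model = doc.createModel()
    comp = model.createCompartment()
    comp.setId("cell")
    comp.setConstant(True)
    for sid in species_ids:
        s = model.createSpecies()
        s.setId(sid)
        s.setCompartment("cell")
        s.setConstant(False)
        s.setBoundaryCondition(False)
        s.setHasOnlySubstanceUnits(False)
    libsbml.writeSBMLToFile(doc, path)

# e.g. a species refined into hypothetical free and bound variants
write_refined_species(["hsp70_free", "hsp70_bound"], "refined_model.xml")
```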