Statistical Model Checking with Robust Markov Decision Processes
Publication: Thesis › Dissertation
Contributors
Abstract
Autonomous systems in safety-critical domains require rigorous verification of correctness, safety, and reliability. Traditional model checking techniques address this problem by formally proving that a given system model adheres to a formal specification. However, this in turn requires that an exact representation of the system as a formal model is available. In practice, this is often unrealistic, e.g. for cyber-physical systems that exhibit probabilistic behavior due to interaction with an unknown environment. This is known as the validation problem. In this thesis we explore statistical model checking (SMC) as a unified approach that tackles the validation and verification problems simultaneously for systems without a known formal model. By sampling directly from the real underlying system, we construct a confidence region of models which contains a model corresponding to the true system with high probability. Then, by performing robust optimization over this confidence region, we obtain probably approximately correct (PAC) bounds on the objective of interest. Since samples must be generated directly from the system in this setting, we put an emphasis on the sample-efficiency of the SMC process. In this thesis we focus on probabilistic systems with non-deterministic choices modeled by Markov decision processes (MDPs), for which a confidence region is given in the form of a robust MDP (RMDP). For this task, we introduce a compositional framework for model-based SMC with PAC guarantees, consisting of three sub-problems: sampling the MDP, building the RMDP, and solving the RMDP. We further investigate each of these aspects in detail for settings in which the objective is to maximize or minimize the expected reward or cost in a system. For the problem of sampling the MDP, we develop heuristics that are inspired by observations in human behavior.
This is motivated by the observation that in many areas, humans are able to learn tasks with much less experience than current machine learning approaches require. For building an RMDP from the gathered sample data, we develop data-efficient methods that construct a confidence region of plausible MDPs, improving upon previous work by employing statistical methods that allow for smaller confidence regions. In particular, we emphasize the necessity of sound estimation methods that always guarantee that the true system model lies within the confidence region with a prescribed probability. Finally, towards solving the built RMDP, we first perform a thorough analysis of the theoretical aspects of RMDPs and identify sufficient conditions for memoryless deterministic (MD) determinacy and the existence of optimal MD policies. We then develop preprocessing steps that ensure these properties hold, as well as efficient value-iteration-based approaches with a focus on convergence and sound termination conditions. Thus, this work advances the practicality of model-based SMC by unifying model acquisition and robust optimization, delivering PAC guarantees while minimizing costly system sampling.
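To make the pipeline sketched in the abstract concrete, the following is a minimal, self-contained illustration of two of its sub-problems: building an interval-shaped confidence region (a simple RMDP) from transition counts, and solving it with pessimistic robust value iteration. This is not the thesis's actual method: the toy MDP, the Hoeffding-style intervals (a standard but conservative sound baseline; the thesis develops tighter regions), and all function names are illustrative assumptions.

```python
import math

# Hypothetical toy data (not from the thesis): transition counts per
# (state, action) pair, gathered by sampling the system.
# State 2 is an absorbing "goal" state carrying reward 1.
counts = {
    (0, 'a'): {0: 6, 1: 14},
    (0, 'b'): {0: 10, 1: 10},
    (1, 'a'): {1: 4, 2: 16},
}

def interval_region(counts, delta=0.05):
    """Build an interval confidence region from counts.

    Each transition probability gets a Hoeffding-style interval that
    contains the true probability with confidence 1 - delta; this is
    one standard (sound but conservative) way to build the region.
    """
    region = {}
    for (s, a), succ in counts.items():
        n = sum(succ.values())
        eps = math.sqrt(math.log(2.0 / delta) / (2.0 * n))
        region[(s, a)] = {t: (max(0.0, k / n - eps), min(1.0, k / n + eps))
                          for t, k in succ.items()}
    return region

def worst_case(intervals, V):
    """Inner minimization: minimize sum_t p(t) * V(t) over distributions
    respecting the intervals, by assigning lower bounds first and then
    pouring the remaining mass into the lowest-value successors."""
    mass = 1.0 - sum(lo for lo, _ in intervals.values())
    total = sum(lo * V[t] for t, (lo, _) in intervals.items())
    for t in sorted(intervals, key=lambda t: V[t]):  # worst successors first
        lo, hi = intervals[t]
        add = min(hi - lo, mass)
        total += add * V[t]
        mass -= add
    return total

def robust_value_iteration(region, reward, gamma=0.9, tol=1e-6):
    """Pessimistic (max-min) value iteration: the controller maximizes,
    nature picks the worst distribution inside each interval."""
    states = {s for (s, _) in region} | {t for succ in region.values() for t in succ}
    V = {s: 0.0 for s in states}
    while True:
        V_new = {}
        for s in states:
            acts = [a for (s2, a) in region if s2 == s]
            if not acts:  # absorbing state: value is its reward
                V_new[s] = reward.get(s, 0.0)
                continue
            V_new[s] = max(reward.get(s, 0.0) + gamma * worst_case(region[(s, a)], V)
                           for a in acts)
        if max(abs(V_new[s] - V[s]) for s in states) < tol:
            return V_new
        V = V_new
```

The resulting values are guaranteed lower bounds on the true optimal expected reward whenever the true MDP lies in the region, which is exactly the PAC-style guarantee the framework targets; note that this sketch stops at a fixed tolerance, whereas the thesis emphasizes sound termination conditions for value iteration.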
| Translated title | Statistisches Model Checking mit Robusten Markow-Entscheidungsprozessen |
|---|---|
Details
| Original language | English |
|---|---|
| Qualification level | Dr.-Ing. |
| Awarding institution | |
| Supervisor / Advisor | |
| Date of defense (date of certificate) | 13 Oct 2025 |
| Publication status | Published - 4 Nov 2025 |
Keywords
- model checking, verification, statistics, Markov Decision Process