Process mining has dramatically changed the way we look at process models and operational processes. Even seemingly simple processes like Purchase-to-Pay (P2P) and Order-to-Cash (O2C) are often amazingly complex, and traditional hand-made process models fail to capture the true fabric of such processes. Many processes are inherently concurrent and involve interactions between different objects (customers, suppliers, orders, items, shipments, payments, machines, workers, etc.). Process mining uses event data to construct process models that can be used to diagnose performance and compliance problems. If such models reflect reality well, they can be used for forward-looking forms of process mining, including predictive analytics, evidence-based automation, and what-if simulation. The ultimate goal is to create a "digital twin" that can be used to explore different improvement actions. In his keynote, Wil van der Aalst will provide an overview of process mining and the spectacular developments in the field. Currently, there are over 35 process mining companies (including Celonis, which is the highest-valued startup in Germany), and most of the larger European firms have applied process mining. Despite the successes and mature tools, some of the foundational problems remain "unsolved" and will block future progress. Therefore, the keynote highlights the challenges related to concurrency and objects in process mining.
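As a minimal illustration of the core idea behind discovering process models from event data (my own sketch, not from the talk, with an invented toy event log): a common first step of process discovery is to extract a directly-follows relation, counting how often one activity immediately follows another within the same case.

```python
from collections import Counter

# Toy event log: each case (e.g. a purchase order) is a sequence of activities.
event_log = {
    "order-1": ["create order", "approve", "ship", "invoice", "pay"],
    "order-2": ["create order", "approve", "invoice", "ship", "pay"],
    "order-3": ["create order", "reject"],
}

def directly_follows(log):
    """Count how often activity a is immediately followed by activity b."""
    dfg = Counter()
    for trace in log.values():
        for a, b in zip(trace, trace[1:]):
            dfg[(a, b)] += 1
    return dfg

dfg = directly_follows(event_log)
# ("create order", "approve") occurs in two of the three cases, so its count is 2.
```

From such counts, discovery algorithms construct process models; notice that "ship" and "invoice" appear in both orders, hinting at the concurrency the keynote highlights.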
Prof.dr.ir. Wil van der Aalst is a full professor at RWTH Aachen University, leading the Process and Data Science (PADS) group. He is also part-time affiliated with the Fraunhofer-Institut für Angewandte Informationstechnik (FIT), where he leads FIT's Process Mining group, and with the Technische Universiteit Eindhoven (TU/e). Until December 2017, he was the scientific director of the Data Science Center Eindhoven (DSC/e) and led the Architecture of Information Systems group at TU/e. Since 2003, he has held a part-time position at Queensland University of Technology (QUT). Currently, he is also a distinguished fellow of Fondazione Bruno Kessler (FBK) in Trento and a member of the Board of Governors of Tilburg University. His research interests include process mining, Petri nets, business process management, workflow management, process modeling, and process analysis. Wil van der Aalst has published over 250 journal papers, 22 books (as author or editor), 550 refereed conference/workshop publications, and 80 book chapters. Many of his papers are highly cited (he is one of the most cited computer scientists in the world; according to Google Scholar, he has an H-index of 155 and has been cited over 115,000 times), and his ideas have influenced researchers, software developers, and standardization committees working on process support. In addition to serving on the editorial boards of over ten scientific journals, he plays an advisory role for several companies, including Fluxicon, Celonis, aiConomix, and UiPath. Van der Aalst received honorary degrees from the Moscow Higher School of Economics (Prof. h.c.), Tsinghua University, and Hasselt University (Dr. h.c.).
He is also an IFIP Fellow, IEEE Fellow, ACM Fellow, and an elected member of the Royal Netherlands Academy of Arts and Sciences, the Royal Holland Society of Sciences and Humanities, the Academy of Europe, and the North Rhine-Westphalian Academy of Sciences, Humanities and the Arts (Nordrhein-Westfälische Akademie der Wissenschaften und der Künste). In 2018, he was awarded an Alexander-von-Humboldt Professorship.
When I first read Hardy and Wright’s Number Theory, I was captivated. However, as much as the mathematics itself, one statement always stood out for me.
In the very first chapter they list a number of “questions concerning primes”, the first of which is whether there is a formula for the nth prime.
Hardy and Wright explicitly say that this seems “unlikely”, given that the distribution of the primes is “quite unlike what we should expect on any such hypothesis.”
I think most number theorists would still agree with this assertion; indeed, many cryptographic techniques would collapse if such a formula existed.
Yet what underlies this sense that the structure of the primes and the structure of formulae are so different?
It is not formal mathematics itself, else it would be a proof.
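The irregularity Hardy and Wright point to is easy to witness empirically. A short sketch (my own illustration, not from the text): compute the gaps between consecutive primes and observe how erratically they fluctuate, quite unlike the values of any smooth formula.

```python
def primes_up_to(n):
    """Sieve of Eratosthenes: all primes <= n."""
    sieve = [True] * (n + 1)
    sieve[0:2] = [False, False]
    for p in range(2, int(n ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p :: p] = [False] * len(sieve[p * p :: p])
    return [i for i, is_prime in enumerate(sieve) if is_prime]

ps = primes_up_to(100)
gaps = [b - a for a, b in zip(ps, ps[1:])]
# The gaps begin 1, 2, 2, 4, 2, 4, 2, 4, 6, 2, ... with no evident pattern.
```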
In engineering, computation, physics, indeed any quantitative or formal domain, the precise and provable sits alongside an informal grasp of the nature of the domain. This was certainly true in my own early and more recent work on formal modelling of human–computer interaction: sometimes, as in the case of undo, one can make exact and faithful statements and proofs, but more often, in order to achieve formal precision, one resorts to simplistic representations of real life. However, despite their gap from the true phenomena, these models, however lacking in fidelity, still give us insight.
I’m sure this will be familiar to those working in other areas where theoretical models are applied to practical problems. There is a quantum-like tension between the complexity of the world and our ability to represent it, between accuracy and precision, between fidelity and formality. Yet we do learn about real phenomena from these simplified models, and in many contexts, from primary-school estimation to scientific research, we use these forms of thinking – I call this qualitative–quantitative reasoning.
This has become particularly important during the Covid pandemic, when both simple formulae and massive supercomputing models offer precise predictions of the impact of specific interventions. However, even the most complex model embodies simplifications, and it is when different models lead to qualitatively similar behaviours that they are most trusted. Similar issues arise for climate change, international economics, and supermarket shopping.
Qualitative–quantitative reasoning is ubiquitous but not often discussed – almost a dirty secret for the formalist, and yet it is what makes theory practical. There are lessons for science and for schools, challenges for visualisation and argumentation. I don’t know all of the answers, but by bringing this to the surface I know there are exciting questions.
Professor Alan Dix is Director of the Computational Foundry at Swansea University, a £31m investment by the University, Welsh Government and
European Union to create a place to nurture fundamental digital research that makes a difference to real lives.
Previously he spent 10 years in a mix of academic and commercial roles.
Alan has worked in human–computer interaction research since the mid 1980s and is the author of one of the major international textbooks on HCI as well as of over 500 research publications, from formal methods to design creativity. Last year his book “Statistics for HCI: Making Sense of Quantitative Data” was published by Morgan & Claypool, and his new online course “Creativity: Methods to Design Better Products and Services” was launched at interaction-design.org. His latest book, “TouchIT: Understanding Design in a Physical–Digital World”, is in press with OUP, and he also has a second edition of the textbook “An Introduction to AI” and a shorter “AI for Social Justice” in preparation.
Although he is best known for his research in HCI, Alan’s early academic background was in pure mathematics, where he was a member of the British team at the 1978 International Mathematical Olympiad in Bucharest, and also statistics, completing a Diploma in Mathematical Statistics at DPMMS in Cambridge after finishing his BA in Mathematics a year early. He has worked in mathematical modelling and aspects of applied AI, including using genetic algorithms in early submarine design and co-directing a dot-com era start-up on intelligent internet interfaces. In 1992 he wrote possibly the first paper to highlight the potential for gender and ethnic bias in black-box machine learning algorithms, and he is now leading the algorithmic social justice theme of the UK Not-Equal network for Digital Economy and Social Justice. He is also a CoI on the Swansea Centre for Doctoral Training “Enhancing Human Interactions and Collaborations with Data and Intelligence Driven Systems”, a major initiative taking a people-first approach to AI and big data, and a CoI on “InterMusE – The Internet of Musical Events: Digital Scholarship, Community and the Archiving of Performance”, a trans-Atlantic collaboration to democratise digitisation.
In this talk I will offer a detailed presentation of how we make symbolic model checking techniques and machine learning join forces in the tool UPPAAL.
The first step towards exploiting the efficiency of machine learning in UPPAAL was made in the branch Uppaal SMC, which provides highly efficient statistical model checking capabilities, allowing for performance analysis of a rich class of stochastic hybrid automata. Most recently, the SMC engine of UPPAAL has been considerably accelerated by exploiting independencies of system components during the generation of random runs. This has allowed us to build agent-based models of the COVID-19 pandemic with millions of components in order to analyze the effect of the different lock-down mechanisms considered in Denmark.
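In essence, statistical model checking replaces exhaustive state-space exploration by Monte Carlo simulation: generate many independent random runs of the stochastic model and estimate the probability that a property holds, with statistical confidence growing in the number of runs. A toy sketch of this idea (my own illustration with invented parameters, not UPPAAL's actual engine):

```python
import random

def simulate_run(steps=50, p_event=0.05):
    """Toy stochastic model: does the event of interest occur within `steps` steps?"""
    return any(random.random() < p_event for _ in range(steps))

def smc_estimate(runs=10_000, **kw):
    """Monte Carlo estimate of P(property holds) from independent random runs."""
    hits = sum(simulate_run(**kw) for _ in range(runs))
    return hits / runs

random.seed(0)
estimate = smc_estimate()
# Analytically, P = 1 - (1 - 0.05)**50 ~ 0.923; the estimate converges to this
# as the number of runs grows, which is why generating runs fast matters.
```

Since the runs are independent, the acceleration mentioned above (exploiting component independencies when generating runs) directly scales the analysis to models with millions of components.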
In the most recent branch, UPPAAL Stratego, symbolic techniques are combined with reinforcement learning to efficiently obtain near-optimal yet safe strategies for hybrid Markov decision processes. The talk will describe the learning algorithms developed, their convergence, and compact strategy representations, as well as a demonstration of the tool on a Dagstuhl “Smart Farming Challenge”. Other applications of UPPAAL Stratego include water management, traffic-light control, and energy-aware buildings.
Kim Guldstrand Larsen has been a Professor of Computer Science at Aalborg University (AAU) since 1993. He received honorary doctorates from Uppsala University (1999) and ENS Cachan (2007), an International Chair at INRIA (2016), and a Distinguished Professorship at Northeastern University, Shenyang, China (2018).
His research interests cover modeling, verification, and performance analysis of real-time and embedded systems, with applications to concurrency theory, model checking, and machine learning.
He is the prime investigator of the verification tool UPPAAL, for which he received the CAV Award in 2013. Other prizes received include the Danish Citation Laureates Award (Thomson Scientific) as the most cited Danish computer scientist in the period 1990-2004 (2005), the Grundfos Prize (2016), and Ridder af Dannebrog (2007). He is a member of the Royal Danish Academy of Sciences and Letters and of the Danish Academy of Technical Sciences, where he is a Digital Wiseman. He is also a member of the Academia Europaea. In 2015 he received the prestigious ERC Advanced Grant (LASSO), and in 2021 he won a Villum Investigator Grant (S4OS). He has been PI and director of several large centers and initiatives, including CISS (Center for Embedded Software Systems, 2002-2008), MT-LAB (Villum Kann Rasmussen Centre of Excellence, 2009-2013), IDEA4CPS (Danish-Chinese Research Center, 2011-2017), INFINIT (National ICT Innovation Network, 2009-2020), and DiCyPS (Innovation Fund Center, 2015-2021). Finally, he is a co-founder of the companies UP4ALL (2000), ATS (2017) and VeriAal (2020).
The usual post-mortem approach to formal language semantics and verification, where the language is first implemented and used in production for many years before a need for formal semantics and verification tools naturally arises, simply does not work anymore. New blockchain languages or virtual machines are proposed at an alarming rate, followed by new versions of them every few weeks, together with programs (or smart contracts) in these languages that are responsible for financial transactions of potentially significant value. Formal analysis and verification tools are therefore needed immediately for such languages and virtual machines. We will present recent academic and commercial results in developing blockchain languages and virtual machines that come directly equipped with formal analysis and verification tools. The main idea is to generate all of these automatically, correct-by-construction, from a formal language specification.
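The "generated from a formal specification" idea can be sketched in miniature (a toy of my own, not the actual machinery behind these tools): give a language's semantics as small-step rewrite rules, and an interpreter falls out for free by rewriting terms to a fixpoint; analysis tools can be derived from the same rules.

```python
# Semantics of a toy expression language given as rewrite rules.
# Terms are nested tuples ("add", l, r) / ("mul", l, r), or an int (a value).

def step(term):
    """Apply one small-step rewrite rule, or return None if term is a value."""
    if isinstance(term, int):
        return None
    op, l, r = term
    if isinstance(l, int) and isinstance(r, int):
        return l + r if op == "add" else l * r   # rule: reduce a redex of values
    if not isinstance(l, int):
        return (op, step(l), r)                  # rule: reduce left operand first
    return (op, l, step(r))                      # rule: then right operand

def run(term):
    """The 'generated' interpreter: rewrite with the rules until a fixpoint."""
    while (nxt := step(term)) is not None:
        term = nxt
    return term

run(("add", ("mul", 2, 3), 4))  # rewrites to ("add", 6, 4), then to 10
```

The point of the correct-by-construction approach is that the interpreter, and likewise verifiers, cannot drift from the semantics, because both are derived from the same rules rather than written twice.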
Grigore Rosu is a professor in the Department of Computer Science at the University of Illinois at Urbana-Champaign (UIUC), where he leads the Formal Systems Laboratory (FSL), and the founder and president of Runtime Verification, Inc (RV). His research interests encompass both theoretical foundations and system development in the areas of formal methods, software engineering and programming languages. Before joining UIUC in 2002, he was a research scientist at NASA Ames. He obtained his Ph.D. at the University of California at San Diego in 2000. He received the CAREER award from the NSF, the Dean’s award for excellence in research from the College of Engineering at UIUC in 2014, and the outstanding junior award from the Computer Science Department at UIUC in 2005. He won the ASE IEEE/ACM most influential paper award in 2016 (for an ASE 2001 paper that helped shape the runtime verification field), the ACM SIGSOFT distinguished paper awards at ASE 2008, ASE 2016, and OOPSLA 2016, and the best software science paper award at ETAPS 2002.