
Higher-Order Complexity Analysis of Rewrite Systems

We rely increasingly on computer systems, and the sophistication of these systems has grown steadily over the last decades. To produce robust systems, we require tools that support verification in an automated setting. Because computational resources such as space (i.e., memory) and time are limited, resource analysis is a central topic in software verification.

We aim to advance the field of static resource analysis by devising new methods for the automated runtime complexity analysis of functional programs. Given a program, such methods should be able to deduce useful bounds on its runtime (i.e., the number of computation steps) as a function of the size of its inputs. To ensure that results are broadly applicable, one usually considers an abstract model of computation instead of a specific programming language. Term rewrite systems are particularly well suited as such an abstract language, and runtime complexity analysis is a very active research area in rewriting; a small illustration is given below. However, the lack of facilities to faithfully model higher-order functions has prevented the use of rewrite systems for the analysis of functional programs written in languages like Haskell or OCaml, where the use of higher-order functions is pervasive.
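As a small, standard illustration (not taken from the project description), consider a first-order rewrite system computing addition on natural numbers in unary notation:

    add(0, y)    -> y
    add(s(x), y) -> s(add(x, y))

Evaluating add(s^n(0), m) takes exactly n + 1 rewrite steps, so the runtime complexity of this system is linear in the size of its first argument. Automated runtime complexity analysis aims to derive bounds of this kind without running the program.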

To overcome this regrettable situation, we base our studies on higher-order rewrite systems (HORSs for short), one of the most general formal models of functional programs. HORSs extend term rewrite systems to terms with abstraction and application; a standard example is sketched below. To date, almost no results are known on the resource analysis of HORSs. Nevertheless, we can draw inspiration from related areas of research, such as program analysis and implicit computational complexity (ICC for short). Notably, the ICC community has a long history of producing characterisations of the class of polytime computable functions and related classes, based on specific instances of higher-order rewrite systems such as the lambda calculus. We will seek to synthesise complexity analysis techniques for HORSs from such implicit characterisations. Although this synthesis is clearly feasible, the methods obtained in this way are rarely precise, and often effective only on a small class of systems. To obtain methods of practical relevance, we will have to calibrate the techniques for precision and effectiveness.
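As an example of the higher-order setting (again a standard illustration, not taken from the project description), the function map, which applies a functional argument to every element of a list, can be expressed by two higher-order rewrite rules:

    map(f, nil)         -> nil
    map(f, cons(x, xs)) -> cons(f x, map(f, xs))

Rewriting map(F, l) for a list l of length n uses n + 1 applications of these rules, but the total runtime also depends on the cost of each call F x. Accounting for the cost of such functional arguments is exactly what makes higher-order complexity analysis more delicate than the first-order case.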

To shed light on expressiveness and to demonstrate the feasibility of the established techniques, we will develop prototypical implementations alongside the theoretical development. Building upon the lessons learned from those prototypes, we will integrate the established techniques into the Tyrolean Complexity Tool (TCT for short). TCT is a highly modular tool for the automated complexity analysis of rewrite systems. To date, it implements a majority of the state-of-the-art techniques for the analysis of first-order rewrite systems. Our additions will allow TCT to analyse the runtime complexity of functional programs fully automatically, via abstractions to HORSs.

Funding

This three-year project is supported by the Austrian Science Fund (FWF), project number J-3563.

List of Publications

  • M. Avanzini and U. Dal Lago. “On Sharing, Memoization, and Polynomial Time”. Information and Computation 261, part 1:3–22, 2018. Elsevier. © Creative Commons License - ND.

  • M. Avanzini and U. Dal Lago. “Automating Size Type Inference and Complexity Analysis”. Proceedings of the 8th Workshop on Developments in Implicit Computational complExity and 5th Workshop on Foundational and Practical Aspects of Resource Analysis, 2017.

  • M. Avanzini and U. Dal Lago. “Automating Sized-Type Inference for Complexity Analysis”. Proceedings of the ACM on Programming Languages 1 (ICFP), 2017. Association for Computing Machinery. © Creative Commons License - ND.

  • M. Avanzini and M. Schaper. “GUBS Upper Bound Solver”. Proceedings of the 17th International Workshop on Developments in Implicit Complexity, 2017.

  • M. Avanzini and G. Moser. “Complexity of Acyclic Term Graph Rewriting”. Proceedings of the 1st International Conference on Formal Structures for Computation and Deduction, volume 52 of Leibniz International Proceedings in Informatics, pages 10:1–10:18, 2016. Leibniz-Zentrum für Informatik. © Creative Commons License - ND.

  • M. Avanzini and G. Moser and M. Schaper. “TcT: Tyrolean Complexity Tool”. (Preprint). Proceedings of the 22nd International Conference on Tools and Algorithms for the Construction and Analysis of Systems, volume 9636 of Lecture Notes in Computer Science, pages 407–423, 2016. Springer Verlag Heidelberg. © Springer Verlag Heidelberg.

  • M. Avanzini and U. Dal Lago. “On Sharing, Memoization, and Polynomial Time”. Proceedings of the 32nd International Symposium on Theoretical Aspects of Computer Science, volume 30 of Leibniz International Proceedings in Informatics, pages 62–75, 2015. Leibniz-Zentrum für Informatik. © Creative Commons License - ND.

  • M. Avanzini and U. Dal Lago and G. Moser. “Higher-Order Complexity Analysis: Harnessing First-Order Tools”. Proceedings of the 6th International Workshop on Developments in Implicit Complexity, 2015.

  • M. Avanzini and U. Dal Lago and G. Moser. “Analysing the Complexity of Functional Programs: Higher-Order Meets First-Order”. Proceedings of the 20th ACM SIGPLAN International Conference on Functional Programming, pages 152–164, 2015. Association for Computing Machinery. © Association for Computing Machinery.

  • M. Avanzini and C. Sternagel and R. Thiemann. “Certification of Complexity Proofs using CeTA”. Proceedings of the 26th International Conference on Rewriting Techniques and Applications, volume 36 of Leibniz International Proceedings in Informatics, pages 23–39, 2015. Leibniz-Zentrum für Informatik. © Creative Commons License - ND.

  • M. Avanzini and B. Felgenhauer. “Type Introduction for Runtime Complexity Analysis”. Proceedings of the 14th Workshop on Termination, 2014.