Graham Hutton. In preparation, 2017.
For more than twenty years now, we have been using the language of integers and addition as a minimal setting in which to explore different aspects of programming language semantics. However, this minimal language, which is sometimes called Hutton's Razor, has never been the subject of an article in its own right. This paper fills this gap, by showing how a range of semantic approaches and new research developments can be explained in a simple manner through the lens of integers and addition. In this setting, it's as easy as 1,2,3.
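The minimal language in question can be captured in a few lines of Haskell. This is a sketch of the standard presentation (a datatype of expressions and its denotational semantics); the names Expr and eval are the conventional choices, not quoted from the paper:

```haskell
-- The language of integers and addition ("Hutton's Razor"):
data Expr = Val Int | Add Expr Expr

-- Denotational semantics: evaluate an expression to an integer.
eval :: Expr -> Int
eval (Val n)   = n
eval (Add x y) = eval x + eval y
```

For example, eval (Add (Val 1) (Add (Val 2) (Val 3))) gives 6.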
Martin Handley and Graham Hutton. In preparation, 2017.
Lazy evaluation is a key feature of Haskell, but can make it difficult to reason about the efficiency of programs. Improvement theory addresses this problem by providing a foundation for proofs of program improvement in a call-by-need setting, and has recently been the subject of renewed interest. However, proofs of improvement are intricate and require an inequational style of reasoning that is unfamiliar to most Haskell programmers. In this article, we present the design and implementation of an inequational reasoning assistant that provides mechanical support for improvement proofs, and demonstrate its utility by verifying a range of improvement results from the literature.
Jennifer Hackett and Graham Hutton. In preparation, 2017.
Parametricity, in both operational and denotational forms, has long been a useful tool for proving correctness properties. So-called "free theorems" have been widely used to prove the correctness of both programs and program transformations. However, while parametricity can help when reasoning about correctness properties, there is as yet no comparable technique for reasoning about program improvement, i.e. when one program uses no more resources than another. Existing theories of parametricity cannot be used to address this problem as they are agnostic with regard to resource usage. This article addresses this problem by presenting a new operational theory of parametricity that is sensitive to time costs, which can be used to prove time improvement properties. We demonstrate the applicability of our theory by showing how it can be used to prove efficiency properties of parametricity-based program transformations, including the well-known foldr/build technique.
Graham Hutton and Patrick Bahr. Submitted to the Journal of Functional Programming, January 2017.
Fifty years ago, John McCarthy and James Painter published the first paper on compiler verification, in which they showed how to formally prove the correctness of a compiler that translates arithmetic expressions into code for a register-based machine. In this article, we revisit this example in a modern context, and show how such a compiler can now be calculated directly from a specification of its correctness using simple equational reasoning techniques.
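To give the flavour of the example, here is a minimal compiler-correctness setup in Haskell, using a stack machine rather than the register machine of McCarthy and Painter; the code-continuation style of comp' is the kind of definition that falls out of calculating the compiler from its specification:

```haskell
data Expr = Val Int | Add Expr Expr

-- Target code for a simple stack machine.
data Code = PUSH Int Code | ADD Code | HALT

-- Compile an expression, threading the code that follows it.
comp :: Expr -> Code
comp e = comp' e HALT

comp' :: Expr -> Code -> Code
comp' (Val n)   c = PUSH n c
comp' (Add x y) c = comp' x (comp' y (ADD c))

-- Execute code against a stack of integers.
exec :: Code -> [Int] -> [Int]
exec (PUSH n c) s           = exec c (n : s)
exec (ADD c)    (m : n : s) = exec c (n + m : s)
exec HALT       s           = s

-- Source semantics, against which the compiler is correct:
eval :: Expr -> Int
eval (Val n)   = n
eval (Add x y) = eval x + eval y
```

Correctness here is the equation exec (comp e) [] = [eval e], which is the kind of specification from which the compiler itself can be calculated.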
Jonathan Fowler and Graham Hutton. Proceedings of the 19th International Symposium on Practical Aspects of Declarative Languages, Paris, France, January 2017.
In property-based testing, a key problem is generating input data that satisfies the precondition of a property. One approach is to attempt to do so automatically, from the definition of the precondition itself. This idea has been realised using the technique of needed narrowing, as in the Lazy SmallCheck system; however, in practice this method often leads to excessive backtracking, resulting in poor efficiency. To reduce the amount of backtracking, we develop an extension to needed narrowing that allows preconditions to fail faster based on the use of overlapping patterns. We formalise our extension, show how it can be implemented, and demonstrate that it improves efficiency in many cases.

Graham Hutton. Cambridge University Press, September 2016.
Haskell is a purely functional language that allows programmers to rapidly develop clear, concise, and correct software. The language has grown in popularity in recent years, both in teaching and in industry. This book is based on the author's experience of teaching Haskell for more than twenty years. All concepts are explained from first principles and no programming experience is required, making this book accessible to a broad spectrum of readers. While Part I focuses on basic concepts, Part II introduces the reader to more advanced topics. This new edition has been extensively updated and expanded to include recent and more advanced features of Haskell, new examples and exercises, selected solutions, and freely downloadable lecture slides and example code. The presentation is clean and simple, while also being fully compliant with the latest version of the language, including recent changes concerning applicative, monadic, foldable, and traversable types.
Venanzio Capretta, Graham Hutton and Mauro Jaskelioff. To appear in the volume of selected papers from the Symposium on Implementation and Application of Functional Languages, Leuven, Belgium, August-September 2016.
Coinductive data structures, such as streams or infinite trees, have many applications in functional programming and type theory, and are naturally defined using recursive equations. But how do we ensure that such equations make sense, i.e. that they actually generate a productive infinite object? A standard means to achieve productivity is to use Banach's fixed-point theorem, which guarantees the unique existence of solutions to recursive equations on metric spaces under certain conditions. Functions satisfying these conditions are called contractions. In this article, we give a new characterization of contractions on streams in the form of a sound and complete representation theorem, and generalize this result to a wide class of non-well-founded structures, first to infinite binary trees, then to final coalgebras of container functors.
These results have important potential applications in functional programming, where coinduction and corecursion are successfully deployed to model continuous reactive systems, dynamic interactivity, signal processing, and other tasks that require flexible manipulation of non-well-founded data. Our representation theorems provide a definition paradigm to compactly compute with such data and easily reason about them.
Graham Hutton and Patrick Bahr. Proceedings of WadlerFest, A List of Successes That Can Change the World, Edinburgh, Scotland, April 2016.
In the field of program transformation, one often transforms programs into continuation-passing style to make their flow of control explicit, and then immediately removes the resulting continuations using defunctionalisation to make the programs first-order. In this article, we show how these two transformations can be fused together into a single transformation step that cuts out the need to first introduce and then eliminate continuations. Our approach is calculational, uses standard equational reasoning techniques, and is widely applicable.
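The end result of this pipeline can be illustrated on the language of integers and addition: starting from a direct evaluator, CPS transformation followed by defunctionalisation yields a first-order abstract machine. This is a sketch of the standard construction, not code from the article; the constructor names for the defunctionalised continuations are illustrative:

```haskell
data Expr = Val Int | Add Expr Expr

-- Defunctionalised continuations: each constructor represents one
-- shape of continuation that arises during evaluation.
data Cont = Top | AddR Expr Cont | AddL Int Cont

-- Evaluate an expression in the context of a continuation.
eval :: Expr -> Cont -> Int
eval (Val n)   c = apply c n
eval (Add x y) c = eval x (AddR y c)

-- Apply a (defunctionalised) continuation to a value.
apply :: Cont -> Int -> Int
apply Top        n = n
apply (AddR y c) n = eval y (AddL n c)
apply (AddL m c) n = apply c (m + n)

run :: Expr -> Int
run e = eval e Top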
Patrick Bahr and Graham Hutton. Journal of Functional Programming, volume 25, September 2015.
In this article we present a new approach to the problem of calculating compilers. In particular, we develop a simple but general technique that allows us to derive correct compilers from high-level semantics by systematic calculation, with all details of the implementation of the compilers falling naturally out of the calculation process. Our approach is based upon the use of standard equational reasoning techniques, and has been applied to calculate compilers for a wide range of language features and their combination, including arithmetic expressions, exceptions, state, various forms of lambda calculi, bounded and unbounded loops, non-determinism, and interrupts. All the calculations in the article have been formalised using the Coq proof assistant, which serves as a convenient interactive tool for developing and verifying the calculations.
Jennifer Hackett and Graham Hutton. Proceedings of the Thirtieth Annual ACM/IEEE Symposium on Logic in Computer Science, Kyoto, Japan, July 2015.
Write down the definition of a recursion operator on a piece of paper. Tell me its type, but be careful not to let me see the operator's definition. I will tell you an optimisation theorem that the operator satisfies. As an added bonus, I will also give you a proof of correctness for the optimisation, along with a formal guarantee about its effect on performance. The purpose of this paper is to explain these tricks.

Jonathan Fowler and Graham Hutton. To appear in the volume of selected papers from the Symposium on Trends in Functional Programming, Sophia Antipolis, France, June 2015.
When testing a program, there are usually some parts that are rarely executed and hence more difficult to test. Finding inputs that guarantee that such parts are executed is an example of a reach problem, which in general seeks to ensure that targeted parts of a program are always executed. In previous work, Naylor and Runciman have developed a reachability solver for Haskell, based on the use of lazy narrowing from functional logic programming. Their work was focused on practical issues concerning implementation and performance. In this paper, we lay the groundwork for an underlying theory of such a system, by formally establishing the correctness of a simple reach solver.
Jennifer Hackett and Graham Hutton. Proceedings of the 19th ACM SIGPLAN International Conference on Functional Programming, Gothenburg, Sweden, September 2014.
Much research in program optimisation has focused on formal approaches to correctness: proving that the meaning of programs is preserved by the optimisation. Paradoxically, there has been comparatively little work on formal approaches to efficiency: proving that the performance of optimised programs is actually improved. This paper addresses this problem for a general-purpose optimisation technique, the worker/wrapper transformation. In particular, we use the call-by-need variant of improvement theory to establish conditions under which the worker/wrapper transformation is formally guaranteed to preserve or improve the time performance of programs in lazy languages such as Haskell.
Neil Sculthorpe and Graham Hutton. Journal of Functional Programming, 24(1):113-127, January 2014.
The worker/wrapper transformation is a general-purpose technique for refactoring recursive programs to improve their performance. The two previous approaches to formalising the technique were based upon different recursion operators and different correctness conditions. In this article we show how these two approaches can be generalised in a uniform manner by combining their correctness conditions, extend the theory with new conditions that are both necessary and sufficient to ensure the correctness of the worker/wrapper technique, and explore the benefits that result. All the proofs have been mechanically verified using the Agda system.
Powerpoint slides;
Formalisation in Agda;
Extended version (with worked examples and all the proofs).
Jennifer Hackett, Graham Hutton and Mauro Jaskelioff. Proceedings of the 25th Symposium on Implementation and Application of Functional Languages, Nijmegen, The Netherlands, August 2013.
This paper presents a new approach to optimising corecursive programs by factorisation. In particular, we focus on programs written using the corecursion operator unfold. We use and expand upon the proof techniques of guarded coinduction and unfold fusion, capturing a pattern of generalising coinductive hypotheses by means of abstraction and representation functions. The pattern we observe is simple, has not been observed before, and is widely applicable. We develop a general program factorisation theorem from this pattern, demonstrating its utility with a range of practical examples.
Laurence E. Day and Graham Hutton. Proceedings of the 25th Symposium on Implementation and Application of Functional Languages, Nijmegen, The Netherlands, August 2013.
In previous work, we proposed a new approach to the problem of implementing compilers in a modular manner, by combining earlier work on the development of modular interpreters using monad transformers with the à la carte approach to modular syntax. In this article, we refine and extend our existing framework in a number of directions. In particular, we show how generalised algebraic datatypes can be used to support a more modular approach to typing individual language features, we increase the expressive power of the framework by considering mutable state, variable binding, and the issue of noncommutative effects, and we show how the Zinc Abstract Machine can be adapted to provide a modular universal target machine for our modular compilers.
Haskell source code is also available.
Graham Hutton and Mauro Jaskelioff. October 2011.
Streams, or infinite lists, have many applications in functional programming, and are naturally defined using recursive equations. But how do we ensure that such equations make sense, i.e. that they actually produce well-defined streams? In this article we present a new approach to this problem, based upon the topological notion of contractive functions on streams. In particular, we give a sound and complete representation theorem for contractive functions on streams, illustrate the use of this theorem as a practical means to produce well-defined streams, and show how the efficiency of the resulting definitions can be improved using another representation of contractive functions.
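The recursive equations in question are ordinary Haskell stream definitions, such as the following (illustrative examples, not taken from the article):

```haskell
-- Streams modelled as infinite lists.  Each definition below is
-- productive: every recursive call occurs beneath a cons cell, which
-- corresponds to the function on streams being contractive.
nats :: [Integer]
nats = 0 : map (+1) nats

fibs :: [Integer]
fibs = 0 : 1 : zipWith (+) fibs (tail fibs)
```

The representation theorem characterises exactly which functions on streams admit solutions of this well-defined kind.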
Powerpoint slides;
Haskell source code;
Extended version (with all the proofs);
Video lecture.
Laurence E. Day and Graham Hutton. Trends in Functional Programming volume 12 (Ricardo Peña and Rex Page, editors), LNCS 7193, Springer, March 2012. Selected papers from the 12th International Symposium on Trends in Functional Programming, Madrid, Spain, May 2011. Received the award for Best Student Paper.
Compilers are traditionally factorised into a number of separate phases, such as parsing, type checking, code generation, etc. However, there is another potential factorisation that has received comparatively little attention: the treatment of separate language features, such as mutable state, input/output, exceptions, concurrency and so forth. In this article we focus on the problem of modular compilation, in which the aim is to develop compilers for separate language features independently, which can then be combined as required. We summarise our progress to date, issues that have arisen, and further work.
Haskell source code is also available.
Graham Hutton, Mauro Jaskelioff, and Andy Gill. Journal of Functional Programming Special Issue on Generic Programming, 20(3&4):353-373, June 2010.
The worker/wrapper transformation is a general technique for improving the performance of recursive programs by changing their types. The previous formalisation (Gill & Hutton, 2009) was based upon a simple fixed point semantics of recursion. In this article we develop a more structured approach, based upon initial algebra semantics. In particular, we show how the worker/wrapper transformation can be applied to programs defined using the structured pattern of recursion captured by fold operators, and illustrate our new technique with a number of examples.
An extended article (with all the proofs) is also available.
Liyang Hu and Graham Hutton. Trends in Functional Programming volume 10 (Zoltán Horváth and Viktória Zsók, editors), Intellect, September 2010. Selected papers from the Tenth Symposium on Trends in Functional Programming, Komárno, Slovakia, June 2009.
The standard approach to proving compiler correctness for concurrent languages requires the use of multiple translations into an intermediate process calculus. We present a simpler approach that avoids the need for such an intermediate language, using a new method that allows us to directly establish a bisimulation between the source and target languages. We illustrate the technique on two small languages, using the Agda system to present and formally verify our compiler correctness proofs.
Andy Gill and Graham Hutton. Journal of Functional Programming, 19(2):227-251, March 2009. Invited submission.
The worker/wrapper transformation is a technique for changing the type of a computation, usually with the aim of improving its performance. It has been used by compiler writers for many years, but the technique is little-known in the wider functional programming community, and has never been described precisely. In this article we explain, formalise, and explore the generality of the worker/wrapper transformation. We also provide a systematic recipe for its use as an equational reasoning technique for improving the performance of programs, and illustrate the power of this recipe using a range of examples.
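The shape of the transformation can be sketched using the textbook example of list reversal; this is an illustration in the general style of the article, with illustrative names:

```haskell
-- Original program: quadratic-time reverse.
rev :: [a] -> [a]
rev []       = []
rev (x : xs) = rev xs ++ [x]

-- Worker: the type is changed to take an accumulator,
-- giving a linear-time program.
revW :: [a] -> [a] -> [a]
revW []       acc = acc
revW (x : xs) acc = revW xs (x : acc)

-- Wrapper: recovers a program of the original type from the worker.
rev' :: [a] -> [a]
rev' xs = revW xs []
```

The worker/wrapper theory gives conditions under which rev and rev' are guaranteed to be equal.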
Powerpoint slides and an extended article (with all the proofs) are also available.
Mauro Jaskelioff, Neil Ghani, and Graham Hutton. Electronic Notes in Theoretical Computer Science, volume 229, issue 5, pages 75-95, March 2011. Proceedings of the Second Workshop on Mathematically Structured Functional Programming, Reykjavik, Iceland, July 2008.
Structural operational semantics is a popular technique for specifying the meaning of programs by means of inductive clauses. One seeks syntactic restrictions on those clauses so that the resulting operational semantics is well-behaved. This approach is simple and concrete but it has some drawbacks. Turi pioneered a more abstract categorical treatment based upon the idea that operational semantics is essentially a distribution of syntax over behaviour. In this article we take Turi's approach in two new directions. Firstly, we show how to write operational semantics as modular components and how to combine such components to specify complete languages. Secondly, we show how the categorical nature of Turi's operational semantics makes it ideal for implementation in a functional programming language such as Haskell.
Haskell source code is also available.
Liyang Hu and Graham Hutton. Trends in Functional Programming volume 9 (Peter Achten, Pieter Koopman, and Marco Morazán, editors), Intellect, July 2009. Selected papers from the Ninth Symposium on Trends in Functional Programming, Nijmegen, The Netherlands, May 2008.
In recent years there has been much interest in the idea of concurrent programming using transactional memory, for example as provided in STM Haskell. While programmers are provided with a simple high-level model of transactions in terms of a stop-the-world semantics, the low-level implementation is rather more complex, using subtle optimisation techniques to execute multiple concurrent transactions efficiently, which is essential to the viability of the programming model. In this article, we take the first steps towards a formally verified implementation of transactional memory. In particular, we present a stripped-down, idealised concurrent language inspired by STM Haskell, and show how a low-level implementation of this language can be justified with respect to a high-level semantics, by means of a compiler and its correctness theorem, mechanically tested using QuickCheck and HPC. The use of these tools proved to be invaluable in the development of our formalisation.
Graham Hutton and Diana Fulger. Proceedings of the Symposium on Trends in Functional Programming, Nijmegen, The Netherlands, May 2008.
Pure functional languages support programming with impure effects by exploiting mathematical notions such as monads, applicative functors, and arrows. However, in contrast to the wealth of research on the use of these notions to write effectful programs, there has been comparatively little progress on reasoning about the resulting programs. In this article we focus on this problem, using a simple but instructive example concerned with relabelling binary trees.
An extended article (with all the proofs) is also available.
Graham Hutton and Joel Wright. Journal of Functional Programming, 17(6):777-792, Cambridge University Press, November 2007.
Asynchronous exceptions, or interrupts, are important for writing robust, modular programs, but are traditionally viewed as being difficult from a semantic perspective. In this article we present a simple, formally justified, semantics for interrupts. Our approach is to show how a high-level semantics for interrupts can be justified with respect to a low-level implementation, by means of a compiler and its correctness theorem. In this manner we obtain two different perspectives on the problem, formally shown to be equivalent, which gives greater confidence in the correctness of our semantics.
Powerpoint slides, Haskell source code and an extended article (with all the proofs) are also available.
Graham Hutton. Cambridge University Press, January 2007.
Haskell is one of the leading languages for teaching functional programming, enabling students to write simpler and cleaner code, and to learn how to structure and reason about programs. This introduction is ideal for beginners: it requires no previous programming experience and all concepts are explained from first principles via carefully chosen examples. Each chapter includes exercises that range from the straightforward to extended projects, plus suggestions for further reading on more advanced topics. The presentation is clear and simple, and benefits from having been refined and class-tested over several years. The result is a text that can be used with courses, or for self-learning. Features include: freely accessible powerpoint slides for each chapter; solutions to exercises, and examination questions (with solutions) available to instructors; downloadable code that's fully compliant with the latest Haskell release.
Catherine Hope and Graham Hutton. Proceedings of the Workshop on Mathematically Structured Functional Programming, Kuressaare, Estonia, July 2006.
There are many advantages to writing functional programs in a compositional style, such as clarity and modularity. However, the intermediate data structures produced may mean that the resulting program is inefficient in terms of space. These may be removed using deforestation techniques, but whether the space performance is actually improved depends upon the structures being consumed in the same order that they are produced. In this paper we explore this problem, using examples where the intermediate structure is a list, and present a solution. We then formalise these results by using abstract machines to expose the underlying data structures involved.
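A small example of the kind of intermediate structure at issue (an illustration, not code from the paper):

```haskell
-- Compositional version: map builds an intermediate list
-- that sum then consumes.
sumSquares :: [Int] -> Int
sumSquares xs = sum (map (\x -> x * x) xs)

-- Deforested version: a single pass, no intermediate list.
sumSquares' :: [Int] -> Int
sumSquares' []       = 0
sumSquares' (x : xs) = x * x + sumSquares' xs
```

Here the intermediate list is consumed in the same order it is produced, so deforestation does improve space behaviour; the paper examines when this condition holds.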
Catherine Hope and Graham Hutton. Implementation and Application of Functional Languages, Lecture Notes in Computer Science volume 4015, Springer Berlin / Heidelberg, 2006. Selected papers from the 17th International Workshop on Implementation and Application of Functional Languages, Dublin, Ireland, September 2005.
Starting with an evaluator for a language, an abstract machine for the same language can be mechanically derived using successive program transformations. This has relevance to studying both the space and time properties of programs because these can be estimated by counting transitions of the abstract machine and measuring the size of the additional data structures needed, such as environments and stacks. In this article we use this process to derive a function that accurately counts the number of steps required to evaluate expressions in a simple language.
Graham Hutton, editor. Bulletin of the European Association for Theoretical Computer Science, number 86, pages 241-256, June 2005.
This report contains edited abstracts from BCTCS 2005, which was held on 22nd to 24th March 2005 in Nottingham, England.
Jeremy Gibbons and Graham Hutton. Fundamenta Informaticae Special Issue on Program Transformation, 66(4):353-366, IOS Press, April-May 2005.
Recursion is a well-known and powerful programming technique, with a wide variety of applications. The dual technique of corecursion is less well-known, but is increasingly proving to be just as useful. This article is a tutorial on the four main methods for proving properties of corecursive programs: fixpoint induction, the approximation (or take) lemma, coinduction, and fusion.
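A representative corecursive program and property, of the kind the tutorial's four methods are used to prove (an illustrative example; the name iter is used to avoid clashing with the Prelude's iterate):

```haskell
-- A corecursive producer: iter f x = [x, f x, f (f x), ...]
iter :: (a -> a) -> a -> [a]
iter f x = x : iter f (f x)

-- The map-iterate property, provable by coinduction or the take lemma:
--   map f (iter f x) = iter f (f x)
```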
Graham Hutton and Joel Wright. Trends in Functional Programming volume 5 (Hans-Wolfgang Loidl, editor), Intellect, February 2006. Selected papers from the Fifth Symposium on Trends in Functional Programming, Munich, November 2004.
In previous work we showed how to verify a compiler for a small language with exceptions. In this article we show how to calculate, as opposed to verify, an abstract machine for this language. The key step is the use of Reynolds's defunctionalisation, an old program transformation technique that has recently been rejuvenated by the work of Danvy et al.
Extended version (with all the calculations);
Powerpoint slides;
Video lecture.
Graham Hutton and Joel Wright. Proceedings of the 7th International Conference on Mathematics of Program Construction, Lecture Notes in Computer Science volume 3125, Springer, July 2004.
Exceptions are an important feature of modern programming languages, but their compilation has traditionally been viewed as an advanced topic. In this article we show that the basic method of compiling exceptions using stack unwinding can be explained and verified both simply and precisely, using elementary functional programming techniques. In particular, we develop a compiler for a small language with exceptions, together with a proof of its correctness.
Extended version (with all the proofs);
Formalisation in Isabelle (by Tobias Nipkow);
Powerpoint slides;
Haskell source code;
Version with jumps.
Graham Hutton. Lecture notes of the MGS-APPSEM Spring School, Nottingham, England, March 2004.
Functional programming is a style of programming in which the basic method of computation is the application of functions to arguments. The aim of this introductory course is to review some of the basics of functional programming, using the modern functional language Haskell. The course comprises ten mini-lectures, two extended programming examples, and a take-home exam.
Graham Hutton, editor. University of Nottingham, March 2003.
This volume contains the proceedings of the First APPSEM-II Workshop, which was held on 26th to 28th March 2003 in Nottingham, England.
Graham Hutton. Journal of Functional Programming, 12(6):609-616, Cambridge University Press, November 2002.
We systematically develop a functional program that solves the countdown problem, a numbers game in which the aim is to construct arithmetic expressions satisfying certain constraints. Starting from a formal specification of the problem, we present a simple but inefficient program that solves the problem, and prove that this program is correct. We then use program fusion to calculate an equivalent but more efficient program, which is then further improved by exploiting arithmetic properties.
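The core of the brute-force solution can be sketched as follows. This is a simplified illustration in the spirit of the paper: it computes the values obtainable from one fixed ordering of the numbers, whereas the full program also enumerates subsequences and permutations, and is then fused into an efficient version:

```haskell
data Op = Add | Sub | Mul | Div

-- Only operations yielding positive naturals are valid.
valid :: Op -> Int -> Int -> Bool
valid Add _ _ = True
valid Sub x y = x > y
valid Mul _ _ = True
valid Div x y = y /= 0 && x `mod` y == 0

apply :: Op -> Int -> Int -> Int
apply Add x y = x + y
apply Sub x y = x - y
apply Mul x y = x * y
apply Div x y = x `div` y

-- All ways of splitting a list into two non-empty parts.
split :: [a] -> [([a], [a])]
split []       = []
split [_]      = []
split (x : xs) = ([x], xs) : [ (x : ls, rs) | (ls, rs) <- split xs ]

-- All values computable from a given ordering of the numbers.
results :: [Int] -> [Int]
results []  = []
results [n] = [n | n > 0]
results ns  = [ apply o x y | (ls, rs) <- split ns
                            , x <- results ls
                            , y <- results rs
                            , o <- [Add, Sub, Mul, Div]
                            , valid o x y ]
```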
Press release;
Talk slides (ppt, wav);
Haskell source code;
Version with timing information;
Video lecture (guest lecture in Microsoft's series on Programming in Haskell).
This work has been mentioned in a number of national and local newspapers, including the Times, the Times Higher Education Supplement, the London Evening Standard, the Nottingham Evening Post, the University of Nottingham Newsletter, and the University of Nottingham Magazine.
Graham Hutton, editor. Journal of Functional Programming, 12(4&5):293-510, Cambridge University Press, July 2002.
This special double issue of the Journal of Functional Programming is devoted to Haskell, and follows on from a series of four workshops on the language that were held during the period 1995-2000.
Graham Hutton, editor. Electronic Notes in Theoretical Computer Science, 41(1), Elsevier Science, August 2001.
This volume contains the proceedings of the Haskell Workshop 2000, which was held on 17th September 2000 in Montreal, Canada. The Workshop was sponsored by ACM SIGPLAN and formed part of the PLI 2000 colloquium on Principles, Logics, and Implementations of high-level programming languages, which comprised the ICFP/PPDP conferences and associated workshops.
Graham Hutton and Jeremy Gibbons. Information Processing Letters, 79(4):197-201, Elsevier Science, August 2001.
The approximation lemma is a simplification of the well-known take lemma, and is used to prove properties of programs that produce lists of values. We show how the approximation lemma, unlike the take lemma, can naturally be generalised from lists to a large class of datatypes, and present a generic approximation lemma that is parametric in the datatype to which it applies. As a useful by-product, we find that generalising the approximation lemma in this way also simplifies its proof.
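For lists, the lemma is based on the following approximation function (the standard definition, written here without n+k patterns): two lists xs and ys are equal precisely when approx n xs = approx n ys for every n.

```haskell
-- approx n xs is the first n elements of xs, followed by
-- undefined (bottom) if xs has more than n elements.
approx :: Integer -> [a] -> [a]
approx 0 _        = undefined
approx _ []       = []
approx n (x : xs) = x : approx (n - 1) xs
```

For example, approx 2 [1,2,3] is 1 : 2 : undefined, while approx 3 [1,2] is the fully defined list [1,2]. The generic version replaces lists by an arbitrary datatype in this definition.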
Jeremy Gibbons, Graham Hutton, and Thorsten Altenkirch. Proceedings of the 4th International Workshop on Coalgebraic Methods in Computer Science, Electronic Notes in Theoretical Computer Science, Volume 44.1, April 2001.
We give a necessary and sufficient condition for when a set-theoretic function can be written using the recursion operator fold, and a dual condition for the recursion operator unfold. The conditions are simple, practically useful, and generic in the underlying datatype.
Jeremy Gibbons and Graham Hutton. Proceedings of the 1st Scottish Functional Programming Workshop, Stirling, Scotland, August 1999.
Corecursive programs produce values of greatest fixpoint types, in contrast to recursive programs, which consume values of least fixpoint types. There are a number of widely used methods for proving properties of corecursive programs, including fixpoint induction, the take lemma, and coinduction. However, these methods are all rather low-level, in that they do not exploit the common structure that is often present in corecursive definitions. We argue for a more structured approach to proving properties of corecursive programs. In particular, we show that by writing corecursive programs using a simple operator that encapsulates a common pattern of corecursive definition, we can then use high-level algebraic properties of this operator to conduct proofs in a purely calculational style that avoids the use of inductive or coinductive methods.
A journal version of this article with a broader emphasis is also available.
Graham Hutton. Journal of Functional Programming, 9(4):355-372, Cambridge University Press, July 1999.
In functional programming, fold is a standard operator that encapsulates a simple pattern of recursion for processing lists. This article is a tutorial on two key aspects of the fold operator for lists. First of all, we emphasize the use of the universal property of fold both as a proof principle that avoids the need for inductive proofs, and as a definition principle that guides the transformation of recursive functions into definitions using fold. Secondly, we show that even though the pattern of recursion encapsulated by fold is simple, in a language with tuples and functions as first-class values the fold operator has greater expressive power than might first be expected.
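The operator, its universal property, and a hint of its expressive power can be summarised as follows (fold is the Prelude's foldr; the derived names are illustrative):

```haskell
-- fold encapsulates a simple pattern of recursion on lists.
fold :: (a -> b -> b) -> b -> [a] -> b
fold f v []       = v
fold f v (x : xs) = f x (fold f v xs)

-- Universal property: g = fold f v  iff
--   g [] = v  and  g (x:xs) = f x (g xs).

suml :: [Int] -> Int
suml = fold (+) 0

mapl :: (a -> b) -> [a] -> [b]
mapl f = fold (\x ys -> f x : ys) []

-- With functions as first-class values, even the left fold
-- can be expressed in terms of fold:
foldlViaFold :: (b -> a -> b) -> b -> [a] -> b
foldlViaFold f v xs = fold (\x g -> \a -> g (f a x)) id xs v
```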
Graham Hutton. Proceedings of the 3rd ACM SIGPLAN International Conference on Functional Programming, Baltimore, Maryland, September 1998.
In this paper we explain how recursion operators can be used to structure and reason about program semantics within a functional language. In particular, we show how the recursion operator fold can be used to structure denotational semantics, how the dual recursion operator unfold can be used to structure operational semantics, and how algebraic properties of these operators can be used to reason about program semantics. The techniques are explained with the aid of two main examples, the first concerning arithmetic expressions, and the second concerning Milner's concurrent language CCS. The aim of the paper is to give functional programmers new insights into recursion operators, program semantics, and the relationships between them.
Graham Hutton and Erik Meijer. Journal of Functional Programming, 8(4):437-444, Cambridge University Press, July 1998.
This paper is a tutorial on defining recursive descent parsers in Haskell. In the spirit of one-stop shopping, the paper combines material from three areas into a single source. The three areas are functional parsers, the use of monads to structure functional programs, and the use of special syntax for monadic programs in Haskell. More specifically, the paper shows how to define monadic parsers using do notation in Haskell. The paper is targeted at the level of a good undergraduate student who is familiar with Haskell, and has completed a grammars and parsing course. Some knowledge of functional parsers would be useful, but no experience with monads is assumed.
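The flavour of the approach can be seen in a small self-contained sketch in the same style (this is illustrative code, not the paper's own library):

```haskell
import Control.Applicative (Alternative (..))
import Data.Char (isDigit)

-- A parser maps input to a list of (result, unconsumed input) pairs:
newtype Parser a = Parser { parse :: String -> [(a, String)] }

instance Functor Parser where
  fmap f p = Parser (\s -> [ (f x, s') | (x, s') <- parse p s ])

instance Applicative Parser where
  pure x    = Parser (\s -> [(x, s)])
  pf <*> px = Parser (\s -> [ (f x, s'') | (f, s')  <- parse pf s
                                         , (x, s'') <- parse px s' ])

instance Monad Parser where
  p >>= f = Parser (\s -> concat [ parse (f x) s' | (x, s') <- parse p s ])

instance Alternative Parser where
  empty   = Parser (const [])
  p <|> q = Parser (\s -> case parse p s of { [] -> parse q s; rs -> rs })

item :: Parser Char                    -- consume a single character
item = Parser (\s -> case s of { [] -> []; (c : cs) -> [(c, cs)] })

sat :: (Char -> Bool) -> Parser Char   -- consume a character satisfying p
sat p = do { c <- item; if p c then pure c else empty }

-- do notation makes the sequencing of parsers read naturally:
nat :: Parser Int
nat = do { ds <- some (sat isDigit); pure (read ds) }
```

The monad handles the threading of unconsumed input behind the scenes, so `parse nat "123abc"` yields `[(123, "abc")]`.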
A monadic parsing library in Haskell.
An extended version of this paper is available as a technical report.
Graham Hutton and Erik Meijer. Technical Report NOTTCS-TR-96-4, Department of Computer Science, University of Nottingham, 1996.
In functional programming, a popular approach to building recursive descent parsers is to model parsers as functions, and to define higher-order functions (or combinators) that implement grammar constructions such as sequencing, choice, and repetition. Such parsers form an instance of a monad, an algebraic structure from mathematics that has proved useful for addressing a number of computational problems. The purpose of this report is to provide a step-by-step tutorial on the monadic approach to building functional parsers, and to explain some of the benefits that result from exploiting monads. No prior knowledge of parser combinators or of monads is assumed. Indeed, this report can also be viewed as a first introduction to the use of monads in programming.
A simple monadic parsing library in Gofer;
A more sophisticated library in Gofer.
A condensed version of this report appears as a functional pearl in JFP.
Graham Hutton and Erik Meijer. Journal of Functional Programming, 6(1):181-188, Cambridge University Press, January 1996.
Many functional programs can be viewed as representation changers, that is, as functions that convert abstract values from one concrete representation to another. Examples of such programs include base-converters, binary adders and multipliers, and compilers. In this paper we give a number of different approaches to specifying representation changers (pointwise, functional, and relational), and present a simple technique that can be used to derive functional programs from the specifications.
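As a tiny example of a representation changer (illustrative, not taken from the paper): converting between binary numbers, represented as lists of bits with the least significant bit first, and the integers they denote:

```haskell
-- From the bit representation to the abstract value:
toInt :: [Int] -> Int
toInt = foldr (\b n -> b + 2 * n) 0

-- And back again, from integers to bits:
toBits :: Int -> [Int]
toBits 0 = []
toBits n = n `mod` 2 : toBits (n `div` 2)
```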
Erik Meijer and Graham Hutton. Proceedings of the 7th SIGPLAN-SIGARCH-WG2.8 International Conference on Functional Programming and Computer Architecture, ACM Press, La Jolla, California, June 1995.
Fold and unfold are general purpose functionals for processing and constructing lists. By using the categorical approach of modelling recursive datatypes as fixed points of functors, these functionals and their algebraic properties were generalised from lists to polynomial (sum-of-product) datatypes. However, the restriction to polynomial datatypes is a serious limitation: it precludes the use of exponentials (function-spaces), even though it is central to functional programming that functions are first-class values, and so exponentials should be usable freely in datatype definitions. In this paper we explain how Freyd's work on modelling recursive datatypes as fixed points of difunctors shows how to generalise fold and unfold from polynomial datatypes to those involving exponentials.
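The polynomial case can be sketched in Haskell by modelling datatypes as fixed points of functors, with fold and unfold defined once generically (an illustrative sketch; the paper's contribution is extending this picture to datatypes involving exponentials):

```haskell
{-# LANGUAGE DeriveFunctor #-}

-- A recursive datatype as the fixed point of its shape functor:
newtype Fix f = In (f (Fix f))

out :: Fix f -> f (Fix f)
out (In x) = x

-- Generic fold (catamorphism) and unfold (anamorphism):
fold :: Functor f => (f a -> a) -> Fix f -> a
fold alg = alg . fmap (fold alg) . out

unfold :: Functor f => (a -> f a) -> a -> Fix f
unfold coalg = In . fmap (unfold coalg) . coalg

-- Lists of Ints as the fixed point of a polynomial functor:
data ListF a = NilF | ConsF Int a deriving Functor

toList :: Fix ListF -> [Int]
toList = fold alg
  where alg NilF          = []
        alg (ConsF x xs)  = x : xs

fromList :: [Int] -> Fix ListF
fromList = unfold coalg
  where coalg []       = NilF
        coalg (x : xs) = ConsF x xs
```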
Graham Hutton, Review of Mike Gordon and Tom Melham (editors), An introduction to HOL, a theorem proving environment for higher order logic. Journal of Functional Programming, 4(4):557-559, Cambridge University Press, October 1994.
Carolyn Brown and Graham Hutton. Proceedings of the 10th Annual IEEE Symposium on Logic in Computer Science, IEEE Computer Society Press, Los Alamitos, California, July 1994.
Languages based upon binary relations offer an appealing setting for constructing programs from specifications. Working with relations rather than functions allows specifications to be more abstract (for example, many programs have a natural specification using the converse operator on relations), and affords a natural treatment of non-determinism in specifications. In this paper we present a novel pictorial interpretation of relational terms as simple pictures of circuits, and a soundness/completeness result that allows relational equations to be proved by pictorial reasoning.
Graham Hutton, Erik Meijer, and Ed Voermans. Distributed on the MOP (mathematics of programming) electronic mailing list, January 1994.
This note describes an implementation of a decision procedure for provable equality of terms in an allegory with products. The tool is implemented in Gofer, and is based upon a soundness/completeness result of Brown and Hutton. The Gofer code was originally developed as a bit of fun, but seems to be practically useful, so we are making it available to other relational programmers. It is our hope that the tool will be a useful concrete aid when first learning about allegories, and of practical use to researchers in verifying simple allegorical equations without the labour of axiomatic proofs.
Graham Hutton. Research Report 72, Chalmers University of Technology, May 1993.
Ruby is a relational calculus for designing digital circuits. This document is a guide to the Ruby interpreter, which allows a special class of "implementable" Ruby programs to be executed. The Ruby interpreter is written in the functional programming language Lazy ML, and is used under the interactive Lazy ML system.
Graham Hutton. Lecture Notes of the STOP Summer School on Constructive Algorithmics, Ameland, The Netherlands, September 1992.
This article is an introduction to the use of relational calculi in deriving programs. Using the relational calculus Ruby, we derive a functional program that adds one bit to a binary number to give a new binary number. The resulting program is unsurprising, being the standard "column of half-adders", but the derivation illustrates a number of points about working with relations rather than with functions.
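The derived program can be rendered functionally as follows (an illustrative Haskell sketch; the article itself works relationally in Ruby). A carry bit is threaded through a column of half-adders, least significant bit first:

```haskell
-- A half-adder: sum is exclusive-or, carry-out is conjunction:
halfAdd :: Bool -> Bool -> (Bool, Bool)   -- (sum, carry-out)
halfAdd x y = (x /= y, x && y)

-- Add one bit (the carry-in) to a binary number, lsb first:
addBit :: Bool -> [Bool] -> [Bool]
addBit c []       = if c then [True] else []
addBit c (b : bs) = s : addBit c' bs
  where (s, c') = halfAdd c b
```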
Graham Hutton. PhD thesis, University of Glasgow, October 1992. Available as Research Report FP-93-5.
This thesis is about the calculational approach to programming, in which one derives programs from specifications. One such calculational paradigm is Ruby, a relational calculus for designing digital circuits. We identify two shortcomings with derivations made using Ruby. The first is that the notion of a program being an implementation of a specification has never been made precise. The second is that certain type assertions that arise during derivations have been verified either by informal arguments or by using predicate calculus, rather than by applying algebraic laws from Ruby.
In this thesis we address both of the shortcomings noted above. We define what it means for a Ruby program to be an implementation, by introducing the notion of a causal relation, and the network denoted by a program. Moreover, we present an interpreter for programs that are implementations. We show how to verify type assertions within Ruby by using algebraic properties of operators that give the best left and right types for a relation.
Graham Hutton. Journal of Functional Programming, 2(3):323-343, Cambridge University Press, July 1992.
In combinator parsing, the text of parsers resembles BNF notation. We present the basic method, and a number of extensions. We address the special problems presented by white-space, and parsers with separate lexical and syntactic phases. In particular, a combining form for handling the offside rule is given. Other extensions to the basic method include an "into" combining form with many useful applications, and a simple means by which combinator parsers can produce more informative error messages.
A parsing library in Gofer;
A parsing library in Lazy ML.
More recent articles that explain the monadic approach to functional parsers are also available, in the form of a short JFP article, or as an extended technical report.
Graham Hutton and Ed Voermans. Research Report 1992/R1, Department of Computing Science, University of Glasgow, 1992.
In the calculational approach to programming, programs are derived from specifications by algebraic reasoning. This report presents a calculational programming framework based upon the notion of binary relations as programs, and partial equivalence relations (pers) as types. Working with relations as programs generalises the functional paradigm, admitting non-determinism and the use of relation converse. Working with pers as types permits a natural treatment of types that are subject to laws and restrictions.
Graham Hutton and Ed Voermans. Proceedings of the 1991 Glasgow Workshop on Functional Programming (Portree, Scotland), Springer-Verlag Series of Workshops in Computing, Springer-Verlag, Berlin, 1992.
The definition of the notion of a "function" is not cast in stone, but depends upon what we adopt as types in our language. With partial equivalence relations (pers) as types in a relational language, we show that the functional relations are precisely those satisfying the simple equation f = f o f∪ o f, where "o" and "∪" are respectively the composition and converse operators for relations. This article forms part of "A calculational theory of pers as types".
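The equation can be checked directly on small finite relations, modelled as lists of pairs (an illustrative Haskell sketch, not code from the article; in the article the characterisation is relative to pers as types):

```haskell
import Data.List (nub, sort)

-- Binary relations as finite sets of pairs:
type Rel a b = [(a, b)]

-- Relational composition: comp s r relates a to c when a r b and b s c:
comp :: (Eq a, Eq b, Eq c) => Rel b c -> Rel a b -> Rel a c
comp s r = nub [ (a, c) | (a, b) <- r, (b', c) <- s, b == b' ]

-- Relational converse:
conv :: Rel a b -> Rel b a
conv r = [ (b, a) | (a, b) <- r ]

-- Check whether a relation satisfies f = f o conv f o f:
satisfiesEq :: (Ord a, Ord b) => Rel a b -> Bool
satisfiesEq f = sort (nub f) == sort (comp f (comp (conv f) f))
```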
Graham Hutton. Proceedings of the 1990 Glasgow Workshop on Functional Programming (Ullapool, Scotland), Springer-Verlag Series of Workshops in Computing, Springer-Verlag, Berlin, 1991.
While programming in a relational framework has much to offer over the functional style in terms of expressiveness, computing with relations is less efficient, and more semantically troublesome. In this paper we propose a novel blend of the functional and relational styles. We identify a class of "causal relations", which inherit some of the bi-directionality properties of relations, but retain the efficiency and semantic foundations of the functional style.
Graham Hutton. Proceedings of the 1989 Glasgow Workshop on Functional Programming (Fraserburgh, Scotland), Springer-Verlag Series of Workshops in Computing, Springer-Verlag, Berlin, 1990.
In combinator parsing, the text of parsers resembles BNF notation. We present the basic method, and show how it may be extended to more practical parsers. We address the special problems of layout, and parsers with separate lexical and syntactic phases. In particular, an elegant new way of handling the offside rule is given. Many examples and comments are given.
(This article is not available in electronic form.)