The schedule is available here. All courses and exercise classes take place either in the School of Computer Science or in Business School South, Jubilee Campus, buildings number 4 and 7 respectively on the campus map. Lunches are served in the Atrium Restaurant, building number 5 on the map.

This course will provide a short introduction to category theory. It will assume some basic undergraduate mathematics (groups, algebra, orders) and computer science (syntax, basic programming, types) that will be used to give examples. I will cover categories, functors, natural transformations, products and coproducts, and adjunctions. I hope to present some additional material but that will depend on the time available.
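As a taste of the first two concepts, here is a minimal Haskell sketch (the function name `safeHead` is mine, not part of the course): `Maybe` and lists are functors, and `safeHead` is a natural transformation between them, as the naturality equation below illustrates.

```haskell
-- A functor sends objects (types) to objects and morphisms (functions)
-- to morphisms, preserving identities and composition; in Haskell,
-- [] and Maybe are functors via fmap.

-- safeHead is a natural transformation from the list functor to Maybe:
-- naturality says fmap f . safeHead == safeHead . fmap f for every f.
safeHead :: [a] -> Maybe a
safeHead []      = Nothing
safeHead (x : _) = Just x

main :: IO ()
main = do
  print (safeHead (fmap (+ 1) [1, 2, 3 :: Int]))
  print (fmap (+ 1) (safeHead [1, 2, 3 :: Int]))
```

Both printed values coincide, which is one instance of the naturality square commuting.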

Further details and slides here.

This course will introduce coalgebras as an abstraction of automata, transition systems and similar models of computation and concurrency. We will show how a category-theoretic analysis of simple examples leads to the notion of a coalgebra and will explore the scope of this generalisation, where notions such as bisimulation and coinduction find their natural home.

As to prerequisites, we will introduce the necessary category-theoretic concepts as we need them, but will not have the time to explain them in full detail. Familiarity with basic category theory (functors, natural transformations, (co)limits, adjoints) will therefore be an advantage.
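As a concrete sketch of the central idea (in Haskell, with names of my own choosing): a coalgebra for a functor is a state space together with a single observe-and-step map, and running it coinductively unfolds behaviour — here, an infinite stream of which we observe a finite prefix.

```haskell
-- A coalgebra for a functor f is a state space s with a map s -> f s.
-- For streams over a, take f s = (a, s): observe a head, step to a tail.
newtype StreamCoalg a s = StreamCoalg { step :: s -> (a, s) }

-- Coinduction: from any state, repeatedly stepping unfolds an infinite
-- stream; here we observe only the first n outputs.
observe :: Int -> StreamCoalg a s -> s -> [a]
observe 0 _ _ = []
observe n c s = let (a, s') = step c s
                in a : observe (n - 1) c s'

-- Example: the natural numbers as a stream coalgebra on state Int.
nats :: StreamCoalg Int Int
nats = StreamCoalg (\n -> (n, n + 1))
```

Two states are bisimilar precisely when no amount of observing can tell them apart.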

Lecture notes available here.

In this course we will focus on the denotational semantics of the simply typed lambda calculus. We aim to show the following results:

- the completeness of alpha, beta and eta rules with respect to the set theoretic model over the natural numbers
- the adequacy of the Scott model for PCF
- Milner's context lemma
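To fix intuitions, the beta and eta rules, and the fixed-point combinator that distinguishes PCF from the pure simply typed calculus, can be mimicked directly in Haskell (a sketch only — the course treats them as rules of a formal calculus, and the names below are mine):

```haskell
-- Beta: applying an abstraction equals substituting the argument.
betaL, betaR :: Int
betaL = (\x -> x + 1) 41
betaR = 41 + 1

-- Eta: \x -> f x equals f, provided x is not free in f.
etaL, etaR :: Int -> Int
etaL = \x -> succ x
etaR = succ

-- PCF adds a fixed-point combinator to the simply typed calculus;
-- Haskell lets us define it directly by recursion.
fixP :: (a -> a) -> a
fixP f = f (fixP f)

factorial :: Int -> Int
factorial = fixP (\rec n -> if n == 0 then 1 else n * rec (n - 1))
```

The Scott model interprets `fixP` as a least fixed point, and adequacy relates that denotation to the operational behaviour of terms like `factorial`.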

The first part is based on Chapter 2 of the book “Semantics of Programming Languages” by Carl A. Gunter, MIT Press 1992. The other two are based on “Domain-Theoretic Foundations of Functional Programming” by Thomas Streicher, World Scientific 2006. Both books contain a lot more material than I can present in this course. The same is true for “Foundations for Programming Languages” by John C. Mitchell, MIT Press 1996. The common feature of these three works and my course is that the simply typed lambda calculus is taken as the starting point, whereas other texts on denotational semantics tend to begin with an (imperative) while-language.

A detailed outline of the course, together with a rationale for the selection of topics, has been published in the **ACM SIGLOG News**, vol. 1, no. 2, October 2014, pages 25–37.

The λ-calculus is a theory of pure functions whose only constructs are function abstraction and application. Alonzo Church invented it in 1936 to answer (negatively) the Entscheidungsproblem (“decision problem”) posed by David Hilbert. It was the first complete model of computability ever defined, followed shortly after by the equivalent model of Turing machines.

The course gives an introduction to the untyped λ-calculus, its most important properties and techniques to define data structures and to program functions on them. We cover traditional objects like numerals and lists, but also infinite streams and non-terminating computations.
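The encoding of numerals can be sketched in Haskell (names are mine; true Church numerals are polymorphic in the result type, which we fix to `Int` here to keep the sketch plain Haskell):

```haskell
-- A Church numeral encodes n as "apply s to z exactly n times".
type Church = (Int -> Int) -> Int -> Int

zero, one :: Church
zero _ z = z
one  s z = s z

-- Successor applies s once more.
suc :: Church -> Church
suc n s z = s (n s z)

-- Addition: iterate s "m times" starting from n's result.
add :: Church -> Church -> Church
add m n s z = m s (n s z)

-- Multiplication: composition of iterators.
mul :: Church -> Church -> Church
mul m n s = m (n s)

-- Read a numeral back as an ordinary Int.
toInt :: Church -> Int
toInt n = n (+ 1) 0
```

Lists, booleans and pairs admit analogous encodings, and streams arise by dropping the requirement that the iteration terminates.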

In the second part of the course, we study various systems of typed λ-calculi with increasingly sophisticated and expressive types. We start with the simply typed λ-calculus, which assigns arrow types to functional objects. Next we introduce the rules for the type of natural numbers, giving us Gödel's system T. We generalize these rules to uniform definitions of inductive and coinductive types. Finally we study polymorphic types and Girard's system F.
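The step from the simply typed calculus to system T can be previewed in Haskell (a sketch with my own names): T adds the natural numbers together with a primitive recursor, from which functions like addition are defined without general recursion.

```haskell
-- The type of natural numbers, with its two constructors.
data Nat = Zero | Succ Nat

-- recNat is system T's recursor: it eliminates a Nat into any type a,
-- given a value for Zero and a step combining n with the recursive result.
recNat :: a -> (Nat -> a -> a) -> Nat -> a
recNat z _ Zero     = z
recNat z s (Succ n) = s n (recNat z s n)

-- Addition by recursion on the first argument.
plus :: Nat -> Nat -> Nat
plus m n = recNat n (\_ rec -> Succ rec) m

-- Convert to Int for inspection (itself an instance of the recursor).
toInt :: Nat -> Int
toInt = recNat 0 (\_ r -> r + 1)
```

Inductive types generalise the recursor, coinductive types dualise it to a corecursor, and system F replaces the fixed type formers by quantification over types.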

Lecture material and detailed outlines of the lectures will appear here.

In this lecture series on Machine Learning (ML), I will try to convey the basic concepts of the field, as well as introduce the audience to the currently massively popular area of Deep Learning. To do so, I will loosely base my lectures on those given to our third-year undergraduates, but at an accelerated pace. The first two-hour slot will cover a general introduction to Machine Learning, including linear regression and linear classification. I will describe gradient descent in some detail, as it is a prime example of the kinds of intrinsic parameter optimisation techniques used in ML. I will end the first lecture with some notes on further reading and pitfalls to avoid, in particular around overfitting and reporting generalisation (i.e. how to properly evaluate an ML system). In the second lecture I will introduce Neural Networks and how to train them, and the third and final lecture will cover Deep Learning, in particular Convolutional Neural Networks for Computer Vision. The lab sessions will allow you to build a smile detector using classical machine learning systems (e.g. Support Vector Machines) and to try a guided tutorial of TensorFlow, the popular open-source toolkit for Deep Learning developed and maintained by Google.
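The gradient-descent idea can be sketched in a few lines (Haskell, with made-up names and toy data — the lectures themselves use standard ML toolkits): each step moves the parameters of a linear model against the gradient of the mean squared error.

```haskell
-- One gradient-descent step for least-squares regression y ~ w*x + b,
-- with learning rate lr over a list of (x, y) points.
descend :: Double -> [(Double, Double)] -> (Double, Double) -> (Double, Double)
descend lr pts (w, b) = (w - lr * gw, b - lr * gb)
  where
    n  = fromIntegral (length pts)
    -- Partial derivatives of the mean squared error.
    gw = (2 / n) * sum [ (w * x + b - y) * x | (x, y) <- pts ]
    gb = (2 / n) * sum [ w * x + b - y       | (x, y) <- pts ]

-- Iterate the step k times from the zero model.
fit :: Int -> Double -> [(Double, Double)] -> (Double, Double)
fit k lr pts = iterate (descend lr pts) (0, 0) !! k
```

On data drawn exactly from y = 2x + 1, repeated steps drive (w, b) towards (2, 1); too large a learning rate makes the same iteration diverge, which is one of the pitfalls the first lecture discusses.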

Slides:

These lectures introduce separation logic, an extended Hoare logic for the verification of programs with pointers, within an algebraic perspective, and with a view on building verification components in proof assistants such as Coq or Isabelle.

The first lecture studies the program states used in separation logic and their dynamics from the point of view of partial abelian monoids, and puts them into the context of general semigroup theory.

The second one constructs the assertion algebra of separation logic as a convolution algebra over such monoids: a quantale, in which the idiosyncratic operation of separating conjunction is convolution. It also puts this construction into a broader context and describes the subalgebra of intuitionistic assertions in terms of quantic (co)nuclei.
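The defining clause behind this convolution — a heap satisfies p * q iff it splits into two disjoint heaplets satisfying p and q respectively — can be sketched executably (Haskell, with names of my choosing; real assertion algebras are of course not computed by enumeration):

```haskell
import qualified Data.Map.Strict as Map
import qualified Data.Set as Set
import Data.List (subsequences)

-- Heaps as finite partial maps from addresses to values; two heaplets
-- compose only when their domains are disjoint (a partial abelian monoid).
type Heap      = Map.Map Int Int
type Assertion = Heap -> Bool

-- All ways of splitting a heap into two disjoint heaplets.
splits :: Heap -> [(Heap, Heap)]
splits h = [ (Map.restrictKeys h ks, Map.withoutKeys h ks)
           | sub <- subsequences (Map.keys h)
           , let ks = Set.fromList sub ]

-- Separating conjunction as convolution over the splittings.
sepConj :: Assertion -> Assertion -> Assertion
sepConj p q h = any (\(h1, h2) -> p h1 && q h2) (splits h)

-- Points-to: the heap is exactly the single cell a |-> v.
pointsTo :: Int -> Int -> Assertion
pointsTo a v h = h == Map.singleton a v
```

For instance, `pointsTo 1 10 `sepConj` pointsTo 2 20` holds of the two-cell heap but fails on either cell alone — the disjointness of the split is what makes the conjunction "separating".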

The third lecture presents a quantale-based predicate transformer semantics for separation logic, which is particularly suitable for verification condition generation. Emphasis is on local monotone predicate transformers, which support the derivation of a Hoare logic with a frame rule for separation logic, as well as a framing-based refinement calculus.

The fourth and final lecture discusses the integration of the program state model into the general algebraic semantics and outlines the formalisation of the resulting program construction and verification components in Isabelle/HOL. It also contrasts the concrete predicate transformer semantics with previous state transformer and relational fault models.

The focus of these lectures is thus less on applications and more on a coherent algebraic approach that makes separation logic and similar program verification methods easy to formalise, in a way that is correct by construction. The content of the exercise sessions will depend on the interest of the participants.

Idris is a general purpose functional programming language with first-class dependent types, building on state-of-the-art techniques in programming language research. Idris aims to make type-based program verification techniques accessible to programming practitioners while supporting efficient systems programming via an optimising compiler and interaction with external libraries.

In this course, I will introduce “Type-driven development”, an approach to programming in which we give types first and write programs step by step, through a series of refinements. I will present several examples, showing how to use Idris for verifying realistic and important properties of software, from simple properties such as array bounds checking, to more complex properties of networked and concurrent systems. In the final lecture, I will show progress towards a new implementation of Idris, written in Idris and following principles of type-driven development.
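The bounds-checking example can be approximated in Haskell with GADTs (Idris expresses this far more directly with first-class dependent types; the names below are mine, not from the course):

```haskell
{-# LANGUAGE DataKinds, GADTs, KindSignatures #-}

-- Type-level natural numbers for tracking lengths.
data Nat = Z | S Nat

-- A vector whose length is part of its type.
data Vec (n :: Nat) a where
  Nil  :: Vec 'Z a
  Cons :: a -> Vec n a -> Vec ('S n) a

-- head is only defined on provably non-empty vectors, so an
-- out-of-bounds access is a type error, not a runtime check;
-- `vhead Nil` simply does not compile.
vhead :: Vec ('S n) a -> a
vhead (Cons x _) = x
```

With full dependent types, the same idea extends to indexing at an arbitrary position, with the bound checked by the type of the index.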

Slides and examples will appear here as the week goes on.

Recommended reading:

- Chapter 1 of Type Driven Development with Idris (chapters 1 & 13 are free downloads)
- The Idris Tutorial

In this course, we will learn about univalent foundations and how to express mathematical reasoning in these foundations. Specifically, we will study:

- Martin-Löf Type Theory (MLTT), the basis of univalent foundations
- The axiomatic extension of MLTT to Univalent Type Theory (UTT): function extensionality, propositional truncation, univalence axiom
- The notions of “proposition” and “set” in UTT
- Mathematics in UTT: algebraic structures and the equivalence principle
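For orientation, the univalence axiom listed above can be stated as follows (notation roughly as in the HoTT book):

```latex
% idtoeqv turns an identification of types into an equivalence of types;
% univalence asserts that this canonical map is itself an equivalence,
% so identifications of types correspond exactly to equivalences.
\[
  \mathsf{idtoeqv} : (A =_{\mathcal{U}} B) \to (A \simeq B)
  \qquad\text{univalence:}\qquad
  (A =_{\mathcal{U}} B) \simeq (A \simeq B)
\]
```

Function extensionality and propositional truncation are consequences of, respectively compatible extensions alongside, this axiom, and together they underpin the equivalence principle studied in the last part of the course.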

Recommended reading:

- HoTT book: comprehensive overview of univalent foundations
- An introduction to univalent foundations for mathematicians (by Dan Grayson): a gentle introduction to univalent foundations