The lecturers of some of the graduate courses were kind enough to let me join their courses before enrolling in a postgrad degree (or maybe I just didn't correct their assumption...). Here is an overview.
I have also taken the liberty of adding some comments on why these areas of study are interesting, how they came to be, and how they connect to other subjects and applications. The courses roughly reflect my interests over time: starting with (algebraic) geometry and topology, moving through functional analysis and operator theory, and ending with stochastic processes.
Summer 2019, Grade: 1.0
The abstract study of commutative algebra was mainly motivated by its application to algebraic geometry.
The course covered:
Hilbert's Nullstellen- & Basissatz, Noetherian & Artinian Rings,
Zariski Topology on Concrete & Abstract Spectra,
Krull Dimension & Transcendence Degree,
Localization, Nakayama's Lemma, Krull's Principal Ideal Theorem, Integral Extensions & Noether Normalization,
Hilbert Polynomials & Dimension, Associated Graded Algebra,
Regular Local Rings & Smoothness,
Tensor Products & Basic Homological Algebra
Winter 2019, Grade: 1.0
Despite its name, algebraic topology (AT) is more than topology with algebraic methods. While point-set topology is concerned with very fine properties (separability, completeness, countability, convergence, ...) of spaces, AT deals with more global phenomena such as (co-)homology, topological K-theory, homotopy, and characteristic classes. Locally, the spaces considered, such as (smooth) manifolds, vector bundles over them, CW complexes, or normed algebras, usually look very similar; it is their global topology that differs.
History: The origins of the field can be traced back to Euler and his considerations of polyhedra. The Euler characteristic (1758) can be seen as the first topological invariant of a space. Major developments followed around the turn of the 20th century, when H. Poincaré constructed what is today known as simplicial homology for polyhedra. In the subsequent years, this notion was extended to more general spaces, ultimately culminating in singular homology theory introduced by S. Eilenberg. Together with N.E. Steenrod, he eventually proposed an axiomatised theory of (co-)homology, and together with S. Mac Lane, he laid the foundation for category theory in order to study various (co-)homology theories.
Inner workings: The inner workings of the theory mostly revolve around identifying topological or homotopical invariants that can be used to distinguish spaces, and analysing these with algebraic methods (such as homological and homotopical algebra). Since being an invariant essentially means being a functor, the theory lends itself very well to category-theoretic methods. However, in many classical aspects AT uses combinatorial methods (such as simplicial sets and complexes) to simplify classification problems (e.g. homotopy can be studied entirely on CW complexes).
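The flavour of such invariants can be illustrated with the very first one, the Euler characteristic. A toy Python sketch (the two triangulations below are the standard tetrahedron and octahedron boundaries, chosen as a hypothetical minimal example) shows that two quite different triangulations of the sphere yield the same value, as an invariant must:

```python
from itertools import combinations

def euler_characteristic(faces):
    """Euler characteristic V - E + F of a triangulated surface,
    given as a list of triangles (3-tuples of vertex labels)."""
    vertices = {v for f in faces for v in f}
    edges = {frozenset(e) for f in faces for e in combinations(f, 2)}
    return len(vertices) - len(edges) + len(faces)

# Boundary of the tetrahedron: a triangulation of the sphere S^2.
tetra = [(0, 1, 2), (0, 1, 3), (0, 2, 3), (1, 2, 3)]

# Boundary of the octahedron: a different triangulation of S^2
# (vertices 0..3 on the equator, 4 and 5 the two apexes).
octa = [(0, 2, 4), (2, 1, 4), (1, 3, 4), (3, 0, 4),
        (2, 0, 5), (1, 2, 5), (3, 1, 5), (0, 3, 5)]

print(euler_characteristic(tetra))  # 2
print(euler_characteristic(octa))   # 2
```

A torus triangulation would instead give 0, so the number really does tell the two spaces apart, without any explicit homeomorphism argument.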
Application in Pure Mathematics: AT finds application wherever topological spaces need to be distinguished by their global properties. Since topological spaces and classification problems are so ubiquitous in mathematics, this happens quite often. Concrete examples are algebraic geometry (e.g. sheaf cohomology, ℓ-adic cohomology), differential geometry (e.g. Chern classes, de Rham cohomology), and operator algebras (e.g. operator K-theory).
Application in Applied Mathematics: Today, mathematical physics makes great use of AT in that many physical problems, especially in fundamental physics, can be phrased topologically. Classical results on de Rham cohomology are used in electromagnetism, whereas modern methods in AT are used in quantum field theories. More recent applications include topological data analysis, where (persistent) homology can be used to identify clusters in data or to recover shapes from incomplete information.
Winter 2019, Grade: 1.0
Functional analysis is concerned with function spaces (e.g. Lebesgue spaces and Sobolev spaces, or spaces of continuous functions on compacta), which are (usually) infinite-dimensional vector spaces with an (often normed) topology whose elements are functions. The field gives a more structural approach to classical topics in analysis such as PDEs and integral equations, and provides a mathematical framework for quantum mechanics.
The primary tool in analysing these spaces is the so-called functional: a continuous linear map into the base field (R or C). The functionals on a space form a function space themselves, the continuous dual space. These lead to striking duality theorems (such as the Fréchet-Riesz theorem for Hilbert spaces or the Riesz-Markov representation theorem for spaces of continuous functions), asserting that seemingly inaccessible dual spaces can be realised as very concrete function spaces.
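In finite dimensions the Fréchet-Riesz theorem is completely explicit: every functional on R^n is the inner product against one fixed vector, which can be read off from the standard basis. A minimal Python sketch (the functional `phi` below is an arbitrary made-up example, not from the course):

```python
import numpy as np

# A continuous linear functional on the Hilbert space R^3
# (hypothetical example; any linear phi would do).
def phi(x):
    return 2.0 * x[0] - x[1] + 0.5 * x[2]

# Frechet-Riesz: phi(x) = <x, y> for a unique vector y.
# In finite dimensions, y_i is just phi applied to the i-th basis vector.
y = np.array([phi(e) for e in np.eye(3)])

x = np.array([1.0, 4.0, -2.0])
print(np.isclose(phi(x), x @ y))  # True
```

The substance of the theorem in infinite dimensions is that the same representation survives for any continuous functional on a Hilbert space, where no finite basis is available.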
The field originates in the classical problems of ODEs, PDEs, and integral equations, in particular J. Fourier's treatment of the heat equation and what is now known as the Fourier transform (1822). Much of the abstract development came with the Polish school of analysis up until 1941, involving S. Banach and J. Schauder.
Summer 2020, Grade: 1.0
p-adic Numbers & Integers, Ostrowski's Theorem, Riesz-Markov Representation Theorem, Haar Measure: Existence & Uniqueness via Izzo's Fixed Point Argument, Topological Vector Spaces, Markov-Kakutani Theorem, Modular Function, Unitary Representation Theory, Schur's Lemma, Induced *-Representation on L1, Measure Algebra, Abstract Fourier Transform, Riemann-Lebesgue Lemma, Ascoli's Theorem, Stone-Weierstraß, Bochner's Theorem, Plancherel's Theorem, Fourier Inversion, Pontrjagin Duality
Advanced Topic in Algebraic Topology
Summer 2020, Grade: 1.7
Commutative Differential Graded Algebras, Stokes' Theorem, Poincaré Lemma & Homotopy Invariance, Poincaré Duality, Künneth Formula, Bundles & G-Torsors, Poincaré-Hopf Index Theorem & Characteristic Classes, Thom Isomorphism
Summer 2020, Grade: 1.0
Operator Theory (OT) is concerned with (bounded or unbounded) linear operators between (usually normed) topological vector spaces, whereas functional analysis deals with operators into the ground field, i.e. with functionals. It has its roots at the beginning of the 20th century in the study of integral equations by Fredholm and, subsequently, in the mathematical development of quantum mechanics.
The inner machinery of the theory mostly deals with abstractly studying properties of operators that naturally come up in the context of integral equations (e.g. Volterra), in ODE & PDE (e.g. Sturm–Liouville theory, Laplacians), or quantum mechanics (e.g. momentum & energy operators). This is done by systematically weakening and strengthening the concepts of continuity.
A cornerstone of the subject is the use of so-called functional calculi: Similar to how the sum and product of two operators have a very natural interpretation, other functional expressions such as f(A), where f is some function and A is a linear operator, can be given meaning. The first example one encounters is probably the structure theorem for solutions of a homogeneous system of linear differential equations, where exp(tA) shows up. The better behaved the operator A, the worse the functions f may behave that can still be evaluated at A, and vice versa. A theory describing which classes of functions can be evaluated at which types of operators is known as a functional calculus.
Another reason for their study is that functional calculi interact very nicely with spectra of linear operators, which are the infinite-dimensional analogues of eigenvalues of matrices. They are particularly important in the study of quantum mechanics, where observables are modelled by linear operators on a Hilbert space and their possible measurement values by the spectrum of that operator.
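For a real symmetric matrix, the finite-dimensional stand-in for a self-adjoint operator, the spectral theorem makes the functional calculus completely explicit: apply f to the eigenvalues and conjugate back. A minimal NumPy sketch of this idea (the matrix A is an arbitrary demo value, not tied to the course):

```python
import numpy as np

def apply_function(A, f):
    """Spectral functional calculus for a real symmetric matrix:
    f(A) = Q diag(f(lambda_i)) Q^T via the eigendecomposition."""
    eigvals, Q = np.linalg.eigh(A)
    return Q @ np.diag(f(eigvals)) @ Q.T

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])  # eigenvalues 1 and 3

# Sanity check: f(x) = x^2 recovers the ordinary matrix product.
print(np.allclose(apply_function(A, np.square), A @ A))  # True

# f(x) = exp(x) yields the matrix exponential exp(A), the t = 1
# evolution of the linear ODE system x'(t) = A x(t).
expA = apply_function(A, np.exp)
```

The functional calculus maps the spectrum {1, 3} of A to the spectrum {e, e^3} of exp(A), a toy instance of the spectral mapping behaviour mentioned above.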
Application: OT is the language of quantum mechanics and finds a lot of application there. On the other hand, it naturally shows up whenever function spaces are systematically studied, which happens in almost every sub-discipline.
Summer 2020, Grade: 1.7
Mathematical probability theory is concerned with phenomena that either cannot be properly modelled by deterministic quantities (such as dispersion or financial markets) or are fundamentally random (such as quantum systems).
History: Classical considerations of probability and chance date back to the sixteenth century, but its modern formulation in terms of measure theory was only developed in the 1930s by A. Kolmogorov. This approach postulates the existence of a probability space consisting of a sample set, a sigma-algebra of measurable events, and a probability measure, together with what is called a random variable: a measurable function from the sample space into some measure space of outcomes (often the real numbers). The sample space can be interpreted as the set of all possible states the universe could be in when the random experiment happens. The sigma-algebra singles out the events about which one can meaningfully ask whether they happened or not, whereas the probability measure assigns to each such event the probability that it occurs. Finally, the random variable itself represents the random experiment, assigning to each state in the sample space an outcome.

The problem here is that the probability space itself is very inaccessible, which is why, instead of taking it as the central object of study, one considers the probability measure it induces on the target space of the random variable: its distribution. The distribution of a random variable indicates with what probability a certain outcome of the experiment occurs. For example, a random variable could be normally distributed.
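The whole setup, sample space, probability measure, random variable, and induced distribution, can be written down concretely for a finite toy model. A minimal Python sketch for two fair dice (a hypothetical example; real sample spaces are usually far richer):

```python
from fractions import Fraction

# Sample space: all 36 states "the universe could be in".
omega = [(i, j) for i in range(1, 7) for j in range(1, 7)]

# Uniform probability measure on the sample space.
P = {w: Fraction(1, 36) for w in omega}

# The random variable "sum of the two dice": it assigns an
# outcome to each state of the sample space.
def X(w):
    return w[0] + w[1]

# Its distribution is the pushforward of P under X: the measure
# on the outcome space {2, ..., 12}.
distribution = {}
for w in omega:
    distribution[X(w)] = distribution.get(X(w), Fraction(0)) + P[w]

print(distribution[7])  # 1/6: six of the 36 states map to the outcome 7
```

The dictionary `distribution` is exactly the accessible object described above: it answers questions about outcomes without ever inspecting individual states of `omega` again.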
As opposed to statistics, which aims to make statements about the underlying random process that generates data, probability theory is concerned with making predictions about the data generated by a given random process. In a sense, they are inverse to one another.
Inner workings: The main goal of PT is not necessarily to compute concrete probabilities of specific events, but rather to prove that certain events happen (or fail to happen) almost surely. Many systems are probabilistic because they depend on many variables, each of which is next to impossible to control on its own; taken together, however, they show predictable behaviour. For example, whether a single fair coin toss comes up heads or tails is not predictable, but with probability 1 the fraction of heads among more and more tosses converges to 50%. Such statements are established by limit theorems such as the law of large numbers and the central limit theorem.
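The coin-toss example is easy to simulate; a short Python sketch (seed and sample sizes are arbitrary choices) shows the individual tosses staying unpredictable while the running fraction of heads settles near 1/2:

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

def head_fraction(n):
    """Fraction of heads in n simulated fair coin tosses
    (randint(0, 1) encodes tails as 0 and heads as 1)."""
    return sum(random.randint(0, 1) for _ in range(n)) / n

# The fraction fluctuates for small n and stabilises as n grows,
# exactly as the law of large numbers predicts.
for n in (10, 1_000, 100_000):
    print(n, head_fraction(n))
```

Nothing in the code forces convergence; the stabilisation is purely a consequence of averaging many independent, individually uncontrollable variables.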
Application in Pure Mathematics: Apart from its obvious role in mathematical statistics, PT can be employed in very pure areas of research as well, such as the "probabilistic method" in combinatorics. Pioneered by P. Erdős, it proves the existence of an object with a desired property by showing that choosing an object at random yields one with positive probability.
Application in Applied Mathematics: From computer science, chemistry and physics to biology, life sciences and economics, PT finds application whenever information is random or appears to be random.
Seminar: From Homotopical Algebra Towards Higher Categories (Visiting)
Chain Homotopies & Quasi Isomorphisms, Model Categories from Localization, Quillen equivalence of Top and sSet, Projective/Injective Model Structure on Functors, Quasi-Categories/(infinity, 1)-categories, (Co)Limits in Quasi-Categories, Derived Algebraic Geometry/Derived Deformation Theory, Dold-Kan Correspondence
Winter 2020, Grade: 1.3
Martingales in continuous time, Brownian motion: construction and properties, Donsker's invariance principle, stochastic integrals, Itô's formula, stochastic differential equations, Girsanov's theorem
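Several of these topics can be illustrated numerically. The sketch below (all parameters are arbitrary demo values, not from the course) builds a Brownian path from independent Gaussian increments and then solves a simple SDE, geometric Brownian motion, with the Euler-Maruyama scheme, comparing against the exact solution that Itô's formula provides:

```python
import numpy as np

rng = np.random.default_rng(42)

T, n = 1.0, 1000
dt = T / n

# Brownian motion on [0, T]: cumulative sums of independent
# N(0, dt) increments, started at W_0 = 0.
dW = rng.normal(0.0, np.sqrt(dt), size=n)
W = np.concatenate([[0.0], np.cumsum(dW)])

# Euler-Maruyama for the SDE dX = mu*X dt + sigma*X dW
# (geometric Brownian motion; mu, sigma are demo values).
mu, sigma = 0.05, 0.2
X = np.empty(n + 1)
X[0] = 1.0
for k in range(n):
    X[k + 1] = X[k] + mu * X[k] * dt + sigma * X[k] * dW[k]

# Ito's formula gives the closed-form solution
#   X_t = exp((mu - sigma^2 / 2) t + sigma W_t),
# so the discretisation error of the scheme can be measured directly.
t = np.arange(n + 1) * dt
exact = np.exp((mu - 0.5 * sigma ** 2) * t + sigma * W)
print(np.max(np.abs(X - exact)))  # small discretisation error
```

Refining the step size shrinks the error, in line with the strong convergence of the Euler-Maruyama scheme.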