How to Learn Advanced Mathematics Without Heading to University - Part 4

In Part 4 of the series we examine some of the necessary stochastic calculus and statistics modules that are most beneficial to those who wish to become quantitative researchers or developers in an investment bank or systematic hedge fund.

It has been some time since we wrote Parts I, II and III of our popular series of articles on How to Learn Advanced Mathematics Without Heading to University. Many of you have contacted us asking for the final Part IV of the series. We have now completed our internal research and can present our view on the most appropriate modules to self-study in lieu of carrying out a structured fourth year of a typical UK four-year undergraduate mathematical masters degree.

Prior to looking at the fourth year modules it is instructive to recap the modules studied in the previous three years.

The first half of our self-taught undergraduate mathematics degree consisted of the core mathematical knowledge needed to tackle later, more specific modules. In particular we emphasised the importance of Analysis (Derivatives, Riemann Integrals and Metric Spaces), Algebra (Group Theory), Linear Algebra (Vector Spaces), Differential Equations (Ordinary and Partial) and Probability.

In the self-study equivalent of a third year we emphasised the importance of Measure Theory and Linear Functional Analysis, within which we learned the underlying concepts necessary to study more rigorous courses in Probability Theory, Stochastic Analysis, Time Series Analysis and Machine Learning.

Now that we have built a strong foundation of mathematical maturity it is time to turn our attention to courses that will help a prospective quant prepare for postgraduate study via a Masters in Financial Engineering (MFE) or make the transition to industry. In particular, many of the course suggestions will be typical of those found in the upper end of a statistics-focused mathematics degree or even a pure statistics degree. Some modules will overlap with the content taught on MFE courses.

One thing to note about the fourth year of a typical undergraduate mathematics degree is that the course content tends to differ significantly between universities. This is largely because most of the available modules are based on the research interests of the academic staff.

Students are also often free to tailor their module choices to their future research interests or career aspirations. Hence there is no 'typical' syllabus at this level.

We have opted to present a range of modules that will be most useful in quantitative finance albeit with an emphasis on stochastic calculus and stochastic processes. By choosing such modules we are necessarily leaving out many useful and interesting areas of mathematics.

As with Part III of the series, at this level of mathematical sophistication there is far less freely available video material. Nearly all available self-study material will be late-stage undergraduate and early postgraduate textbooks along with sporadic lecture notes found at university course module websites.

Year 4

Here is the course list for Year 4:

- Brownian Motion
- Stochastic Analysis
- Stochastic Calculus for Finance
- Stochastic Optimal Control
- Statistical Modeling
- Statistical Machine Learning
- Markov Chains
- High Performance Computing

Brownian Motion

Brownian Motion (or the Wiener Process) is an indispensable concept within quantitative finance. It is widely used as a model for stock price path evolution. Hence gaining familiarity with its mathematical properties is an essential prerequisite for more advanced study in derivatives pricing.

Many Brownian Motion courses will explore the mathematical properties of Brownian Motion, introducing its continuous, but non-smooth, fractal nature. You will also likely be given a rigorous treatment of the Ito Calculus, which allows computation with functions of Brownian Motion. Further topics may include an exposition of Stochastic Differential Equations (SDE), which are also an essential component of derivatives pricing.

While Brownian Motion is an indispensable tool in mathematical finance it is also a necessary prerequisite for study in any Stochastic Calculus modules. In addition Brownian Motion is strongly linked to the concept of a Gaussian Process, a widely utilised Machine Learning model in both academia and industry.
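To get a feel for these properties it is well worth simulating a few sample paths. The following is a minimal sketch, using NumPy and Matplotlib with purely illustrative parameter values, that builds Brownian paths from independent Gaussian increments:

```python
# A minimal sketch of simulating standard Brownian motion sample paths.
# Parameter values (T, n_steps, n_paths) are illustrative choices.
import numpy as np
import matplotlib.pyplot as plt

T = 1.0          # time horizon
n_steps = 500    # number of time steps
n_paths = 5      # number of sample paths
dt = T / n_steps

rng = np.random.default_rng(42)

# Brownian increments are independent N(0, dt) random variables
increments = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))

# Cumulative sums of the increments give the paths; W(0) = 0
paths = np.concatenate(
    [np.zeros((n_paths, 1)), np.cumsum(increments, axis=1)], axis=1
)

t = np.linspace(0.0, T, n_steps + 1)
for w in paths:
    plt.plot(t, w, lw=0.8)
plt.xlabel("t")
plt.ylabel("W(t)")
plt.title("Sample paths of standard Brownian motion")
plt.show()
```

Plotting the paths at successively finer time resolutions is a useful exercise: the jagged, self-similar character persists at every scale, which is the fractal nature referred to above.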

Study Material

There will likely be significant overlap between a course on Brownian Motion and other courses in Stochastic Analysis or Stochastic Calculus for Finance. Hence study recommendations will also overlap to some extent.

Stochastic Analysis

Stochastic Analysis (or Stochastic Calculus) is the theory that underpins modern mathematical finance. It provides a natural framework for carrying out derivatives pricing. While quantitative finance is one of the main application areas of stochastic analysis, it also has a rich research history in the fields of pure mathematics, theoretical physics and engineering.

Stochastic Analysis usually follows on directly from study in Brownian Motion. As a course its primary concern is often to introduce Stochastic Differential Equations and the Ito Calculus that helps solve them.

Most Stochastic Analysis courses introduce various types of SDE, along with discussion on filtering problems (of which the Kalman Filter is a famous example) and diffusions.
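As a concrete illustration of solving an SDE numerically, the following is a minimal sketch, assuming NumPy and purely illustrative parameter values, of the Euler-Maruyama scheme applied to the mean-reverting Ornstein-Uhlenbeck process:

```python
# A sketch of the Euler-Maruyama scheme applied to the Ornstein-Uhlenbeck
# SDE: dX_t = theta * (mu - X_t) dt + sigma * dW_t.
# All parameter values below are purely illustrative.
import numpy as np

theta, mu, sigma = 1.5, 0.0, 0.3   # mean-reversion speed, level, volatility
T, n_steps = 5.0, 5000
dt = T / n_steps

rng = np.random.default_rng(0)
x = np.empty(n_steps + 1)
x[0] = 1.0  # initial condition X_0

for i in range(n_steps):
    dW = rng.normal(0.0, np.sqrt(dt))                        # Brownian increment
    x[i + 1] = x[i] + theta * (mu - x[i]) * dt + sigma * dW  # EM update

print(f"X_T ~ {x[-1]:.4f} (mean-reverting towards mu = {mu})")
```

Coding up such schemes by hand, before reaching for a library, is one of the best ways to build intuition for how the drift and diffusion terms of an SDE interact.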

Stochastic Analysis is a useful prerequisite for studying the field of Stochastic Optimal Control, which finds many applications in engineering and mathematical finance.

In order to get started with Stochastic Analysis a reasonable background in measure-theoretic probability theory is usually necessary. We discussed Measure Theory in our module outline within Part III of this series.

Study Material

As most Stochastic Analysis courses focus on SDE and the Ito Calculus we recommend self-study textbooks in those areas. We have provided a theoretical recommendation as well as a numerical recommendation. The latter is useful for those who prefer to 'code up' solutions in order to aid their intuition:

Stochastic Calculus for Finance

Stochastic Calculus for Finance is another course widely found within the fourth year syllabus at mathematics and statistics departments. It is usually shared with those taking a Masters in Financial Engineering.

The intent with these types of courses is to bring together the theoretical study of Brownian Motion, Stochastic Differential Equations and Ito Calculus in order to price derivative contracts in continuous time.

One of the main concepts discussed in these courses is risk neutral pricing, which then leads into the Black-Scholes model. This then motivates the concept of the 'Greeks', or sensitivities of the option price to various factors, along with delta-hedging.
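To illustrate how such sensitivities are computed in practice, here is a minimal sketch, using SciPy and illustrative parameter values, of the Black-Scholes price and delta of a European call option:

```python
# A minimal sketch of the Black-Scholes European call price and its delta,
# the sensitivity used in delta-hedging. Parameter values are illustrative.
from math import exp, log, sqrt
from scipy.stats import norm

def bs_call_price_delta(S, K, T, r, sigma):
    """Black-Scholes price and delta of a European call option."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    price = S * norm.cdf(d1) - K * exp(-r * T) * norm.cdf(d2)
    delta = norm.cdf(d1)  # sensitivity of price to the underlying, dV/dS
    return price, delta

price, delta = bs_call_price_delta(S=100.0, K=105.0, T=0.5, r=0.01, sigma=0.2)
print(f"Call price: {price:.4f}, Delta: {delta:.4f}")
```

A useful exercise is to perturb S by a small amount and verify that the finite-difference estimate of the price change matches the analytic delta.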

Depending upon the length of the course, other topics such as optimal stopping, American options and interest rate models may also be discussed.

For those who are interested in self-study of a mathematics degree this is perhaps one of the most crucial modules if the goal is to be a practising derivatives pricing quant within industry.

Study Material

There is a wealth of textbooks on the stochastic calculus necessary for derivatives pricing. The most widely recommended introductory text with a reasonable level of mathematical rigour is by Shreve:

Stochastic Optimal Control

Stochastic Optimal Control is a very useful and interesting interdisciplinary field across mathematics, physics, engineering and finance. It provides the underlying theory for certain control problems, which deal with the design of models to control continuously operating dynamical systems that are subject to 'noise' in the current estimates of system state.

A particularly famous recent example is the problem of autonomously landing a rocket booster back at the launch site or on an autonomous drone ship, as implemented and demonstrated by SpaceX. Other day-to-day examples include cruise control in automobiles or the hovering mechanisms in quadcopter drones.

Within finance stochastic optimal control is used for optimal asset allocation decisions, as well as for pricing of American option contracts. It is also closely related to the machine learning field of Reinforcement Learning, itself famous for its recent successes in beating humans at both the ancient game of Go and real-time strategy video games.

The main topics discussed in theoretical stochastic optimal control courses are optimal stopping along with the Bellman Equation and the Hamilton-Jacobi-Bellman equation.

The course requires a good grounding in stochastic processes, SDE and Ito Calculus. The previous modules described thus far in the article are all suitable prerequisites for the basic theory.
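To make the Bellman recursion concrete, the following is a minimal sketch, in NumPy with illustrative parameters, of an optimal stopping problem: pricing an American put on a binomial tree, where each node's value is the maximum of immediate exercise and the discounted continuation value:

```python
# A sketch of optimal stopping via backward induction on a binomial tree:
# pricing an American put. At each node the Bellman-style recursion takes
# the maximum of immediate exercise and continuation value.
# Parameter values are illustrative.
import numpy as np

S0, K, T, r, sigma, n = 100.0, 100.0, 1.0, 0.05, 0.2, 500
dt = T / n
u = np.exp(sigma * np.sqrt(dt))        # up factor (CRR parameterisation)
d = 1.0 / u
p = (np.exp(r * dt) - d) / (u - d)     # risk-neutral up probability
disc = np.exp(-r * dt)

# Terminal stock prices (highest first) and exercise payoffs
S = S0 * u ** np.arange(n, -1, -1) * d ** np.arange(0, n + 1)
V = np.maximum(K - S, 0.0)

# Backward induction: value = max(exercise now, discounted expectation)
for step in range(n - 1, -1, -1):
    S = S0 * u ** np.arange(step, -1, -1) * d ** np.arange(0, step + 1)
    continuation = disc * (p * V[:-1] + (1 - p) * V[1:])
    V = np.maximum(K - S, continuation)

print(f"American put value: {V[0]:.4f}")
```

The same maximise-over-actions-then-step-backwards structure reappears, in continuous time, as the Hamilton-Jacobi-Bellman equation studied in these courses.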

Study Material

Statistical Modeling

Many mathematics and statistics departments provide some form of advanced statistical modeling course, often applied in nature, that goes beyond linear statistical models. Such courses often focus on the theory and application of Generalised Linear Models (GLM), which are an extension of standard linear regression to problems where the expectation of the response variable is not simply given by a linear combination of predictors.

The usual mechanism for teaching such approaches combines theoretical lectures with laboratory sessions using statistical computing packages such as R or Python with the appropriate modeling library. R and Python are both widely utilised within the quantitative finance industry, particularly in systematic hedge funds. The applied approach outlined here is thus often appropriate for those wishing to become quantitative researchers.
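As a flavour of such a laboratory session, the following is a minimal sketch of fitting a Poisson GLM with a log link in Python via the statsmodels library; the synthetic data-generating process is invented purely for illustration:

```python
# A minimal sketch of fitting a Generalised Linear Model with statsmodels:
# a Poisson regression on synthetic count data. The data-generating
# process below is invented purely for illustration.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 1000
x = rng.normal(size=n)

# True model: log E[y | x] = 0.5 + 0.8 * x  (log link, Poisson response)
y = rng.poisson(np.exp(0.5 + 0.8 * x))

X = sm.add_constant(x)                       # design matrix with intercept
model = sm.GLM(y, X, family=sm.families.Poisson())
results = model.fit()
print(results.summary())                     # estimates approx. (0.5, 0.8)
```

Note how the response is no longer a direct linear combination of the predictors: the log link connects the linear predictor to the expectation of the count-valued response, which is exactly the extension beyond ordinary linear regression that GLM courses formalise.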

Machine Learning (and Deep Learning in particular) have fueled the recent rise of Data Science. It is now an extremely popular career choice and often provides lucrative remuneration. Systematic hedge funds are expanding their hiring processes to include data scientists. However, it should be emphasised that much of what is now termed 'Machine Learning' in industry is really the application of classical statistical methods to larger, alternative datasets.

Many quant funds will still expect a rigorous statistical background grounded in classical statistics and the language of probability from their prospective candidates. In essence, simply being familiar with modern Machine Learning algorithms without an underlying theoretical base in statistics is insufficient. Hence the concept of a classical Statistical Modeling course remains more relevant than ever.

Study Material

There are a large number of textbooks on the topic of statistical modeling and GLMs:

Statistical Machine Learning

Machine learning has provided significant leaps in performance on many challenging tasks, including image recognition, natural language processing and even video game competitions, that until relatively recently were deemed unsolvable via a computational approach. Machine learning techniques have also seen widespread adoption within the quantitative finance and systematic trading communities. Such techniques are now fundamentally embedded within many day-to-day interactions within society. Hence their importance as a topic of study continues to grow.

Statistical machine learning is a more theoretical approach to studying the common algorithms that underpin today's AI-related functionality. The subject emphasises a rigorous probabilistic framework for developing machine learning models, grounded in the language of statistics. This 'bottom up' methodology is in contrast to a 'top down' code-first approach, which often seeks to apply algorithms to problems without the need for deep theoretical insight.

Statistical machine learning courses introduce the field with the classical linear regression and logistic regression statistical tools. The importance of regularisation and resampling methodologies is highlighted. More complex models such as Support Vector Machines, Gaussian Processes and tree-based ensembles (such as Random Forests or Gradient Boosted Trees) are often presented in the latter stages of many courses.
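By way of example, the following minimal sketch, using scikit-learn on an invented synthetic regression problem, contrasts a regularised linear model with a tree-based ensemble, scored by k-fold cross-validation:

```python
# A sketch contrasting a regularised linear model with a tree-based
# ensemble using scikit-learn, scored by 5-fold cross-validation.
# The synthetic regression problem is purely illustrative.
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

X, y = make_regression(n_samples=500, n_features=20, noise=10.0, random_state=0)

for name, model in [
    ("Ridge (L2-regularised)", Ridge(alpha=1.0)),
    ("Random Forest", RandomForestRegressor(n_estimators=200, random_state=0)),
]:
    scores = cross_val_score(model, X, y, cv=5, scoring="r2")  # resampling
    print(f"{name}: mean R^2 = {scores.mean():.3f} (+/- {scores.std():.3f})")
```

Cross-validation here plays exactly the resampling role highlighted above: it estimates out-of-sample performance rather than rewarding models for fitting noise in the training data.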

Note that traditional statistical machine learning courses do not often cover artificial neural network (ANN) based approaches, in particular the extremely popular field of Deep Learning. However the insight derived from studying these 'traditional' or 'shallow' methods is extremely useful and provides a solid grounding when attempting to apply more sophisticated models.

Our advice is not to ignore these models due to any perceived lack of effectiveness compared to neural network based approaches. Linear models are still extremely effective in certain cases and having a deep insight into their strengths and weaknesses is a valuable skill, particularly for those interested in a quantitative research career.

Study Material

Machine learning is an extremely popular area. There are a variety of textbooks pitched at the beginner, practitioner and researcher. For a mathematically oriented course such as the one described within these articles, the following books (and MOOC) come well recommended:

Markov Chains

Markov Chains (discrete time) and Markov Processes (continuous time) are stochastic processes that describe a sequence of events in which the probability of a subsequent event occurring depends only upon the current state. This aspect of Markov Processes is known as the Markov Property.

Markov Processes are a well-studied tool in mathematical finance, since they form the basis of the Markov Chain Monte Carlo (MCMC) algorithm, which underpins computational Bayesian statistics. Another application for Markov Chains is in estimating parameters in stochastic volatility models. We have previously discussed Hidden Markov Models as a further application.

Outside of quantitative finance they find uses in modeling queuing processes and disease epidemics. They also form the basis of Markov Decision Processes, which are heavily utilised in the field of Reinforcement Learning. Hence it can be seen that Markov Processes are able to model many real-world phenomena and are consequently worth the investment of self-study.
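To make the Markov Property concrete, here is a minimal sketch, in NumPy with an invented three-state transition matrix, that simulates a discrete-time chain and compares the empirical state frequencies against the stationary distribution:

```python
# A minimal sketch of a discrete-time Markov chain: simulation from a
# transition matrix and recovery of its stationary distribution.
# The three-state matrix below is an invented example.
import numpy as np

# P[i, j] = probability of moving from state i to state j
P = np.array([
    [0.9, 0.1, 0.0],
    [0.2, 0.7, 0.1],
    [0.1, 0.2, 0.7],
])

rng = np.random.default_rng(0)
state, n_steps = 0, 100_000
visits = np.zeros(3)

# The Markov Property: the next state depends only on the current state
for _ in range(n_steps):
    state = rng.choice(3, p=P[state])
    visits[state] += 1

# Stationary distribution: left eigenvector of P with eigenvalue 1
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi /= pi.sum()

print("Empirical frequencies :", visits / n_steps)
print("Stationary distribution:", pi)
```

The long-run agreement between the simulated frequencies and the stationary distribution is, in essence, the property that MCMC algorithms exploit: a chain is constructed whose stationary distribution is the posterior one wishes to sample from.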

Study Material

The following are two introductory textbooks pitched at the upper undergraduate level for those who have worked through some of the courses mentioned above:

High Performance Computing

High Performance Computing (HPC) is the study of efficient parallelisation of computational tasks requiring significant CPU or memory beyond that provided by a typical laptop or desktop workstation. HPC is prevalent in quantitative finance within the realms of derivatives pricing and trading simulation.

'Traditional' HPC often generates any data required as part of a simulation and is thus CPU-bound and memory-bound. To get around this limitation many codes have been parallelised using libraries such as OpenMP or MPI. They are often designed to run on specialist supercomputing hardware. Typical examples in academia and industry include climate models, computational fluid dynamics and molecular dynamics. In quantitative finance the canonical example is generating samples from probability distributions to use in options pricing models.
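As a miniature example of this canonical use case, the following sketch, using only NumPy and Python's standard multiprocessing module with illustrative parameters, parallelises a Monte Carlo European call valuation across CPU cores:

```python
# A sketch of 'traditional' HPC in miniature: parallelising a Monte Carlo
# European call valuation across CPU cores with Python's standard
# multiprocessing module. Parameter values are illustrative.
import numpy as np
from multiprocessing import Pool

S0, K, T, r, sigma = 100.0, 105.0, 1.0, 0.01, 0.2

def mc_call_chunk(args):
    """Discounted payoff sum from one chunk of simulated terminal prices."""
    n_samples, seed = args
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_samples)
    ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * z)
    return np.exp(-r * T) * np.maximum(ST - K, 0.0).sum()

if __name__ == "__main__":
    n_workers, n_per_worker = 4, 250_000
    jobs = [(n_per_worker, seed) for seed in range(n_workers)]
    with Pool(n_workers) as pool:
        total = sum(pool.map(mc_call_chunk, jobs))   # scatter, then reduce
    print(f"MC call price: {total / (n_workers * n_per_worker):.4f}")
```

Because the samples are independent, the work divides cleanly into chunks with a single reduction at the end, which is precisely why Monte Carlo pricing parallelises so well on multi-core and cluster hardware.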

In contrast, modern HPC now includes workloads that fit into the map-reduce paradigm. This often involves carrying out processing and aggregation analytics over a large range of data. In these applications the processing of large datasets can be split up into various distributed chunks and re-assembled once all the individual processes have completed. Typical examples include processing customer analytics within technology startups or recommendation engines for ecommerce. In systematic hedge funds distributed computing is often used to carry out parametrised backtest simulations over large datasets.

Most, if not all, modern quantitative hedge funds will be using some form of distributed computing for their workloads. Hence an understanding of modern tools, such as Hadoop and Spark, along with the performance characteristics of accessing large relational and NoSQL datastores is a valuable skill, both for the quantitative researcher and the quantitative developer.

A further modern approach to HPC is the use of Graphics Processing Units (GPU) to enable highly parallel workloads. GPUs have found widespread adoption as the workhorse for Deep Learning models due to their ability to handle numerical linear algebra problems in an optimal manner. GPUs are also utilised to generate probability samples in Monte Carlo based option pricing approaches.
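As an illustration, the following minimal sketch assumes the CuPy library, whose array API mirrors NumPy but executes on a CUDA-capable GPU, to draw the Monte Carlo samples for a European call. It requires appropriate GPU hardware and the parameter values are purely illustrative:

```python
# A sketch of GPU-accelerated Monte Carlo sampling using CuPy, whose API
# mirrors NumPy but runs on the GPU. Requires a CUDA-capable device;
# parameter values are illustrative.
from math import sqrt
import cupy as cp

S0, K, T, r, sigma, n = 100.0, 105.0, 1.0, 0.01, 0.2, 10_000_000

z = cp.random.standard_normal(n)        # samples drawn directly on the GPU
ST = S0 * cp.exp((r - 0.5 * sigma**2) * T + sigma * sqrt(T) * z)
price = cp.exp(-r * T) * cp.maximum(ST - K, 0.0).mean()
print(f"GPU MC call price: {float(price):.4f}")
```

The code is deliberately almost identical to its NumPy equivalent; the point is that the same embarrassingly parallel sampling workload maps naturally onto the thousands of cores a GPU provides.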

Study Material

There are a few texts on 'traditional' HPC, which largely step through the process of using communications libraries such as OpenMP and MPI. However, of more relevance to the practising quant are guides on modern tools for distributed computing. Hence the focus here is less on academic texts and more on practitioner texts that provide step-by-step examples in modern distributed software tools.

Next Steps

The courses listed above are designed to prepare you for postgraduate study via an MFE or to have sufficient day-to-day knowledge to work in high-end quantitative finance. We have chosen those that will be useful for both quantitative research and quantitative development, since there is often a lot of overlap between roles in modern quant funds.

As we stated in Part III of the series some of the courses might feel rather abstract, with their relevance to day-to-day quant work not apparent at first glance. However, the QuantStart team can personally attest that many of the concepts outlined above 'find their way' into day-to-day quantitative finance practice. We will reiterate that learning these principles will provide you with a definitive advantage when going for interviews.

The next article in the series will concentrate on how to self-study for an MFE, without heading to graduate school, using the modules above as a springboard into modern mathematical finance theory.
