# Taking Types Too Far

Herein we check the Collatz conjecture using only Typescript’s type system.

I love Typescript, but it isn’t nearly ambitious enough. It would be vastly improved with an `--extremelyStrict` flag enforcing that your Typescript code is free of side effects; that is, no Javascript code is generated at all. Real programmers do all of their computation within the type system. Otherwise, they can’t be sure their program will work in production and should be duly fired.

“But you can’t even do arithmetic in the type system,” you complain. Not so.

Javascript is all well and good, but nobody *really* knows how the integers are defined. If we’re going to do Real Number Theory, we need to be sure that our numbers are formal enough to pass muster. Trust your own logic, not the Ecma committee’s. Javascript doesn’t have a natural number type, and you never know what might happen when you try to use Javascript `number`s when you’re trying to do Precise Mathematics.

So let’s define the natural numbers.

Here we are:

```
type Natural = { prev: Natural }
```

Easy. We have a type that we’ve called Natural. It’s not a natural number yet, though. Let’s see if we can get it to comply with the Peano axioms, a set of axioms that formalizes the properties of the natural numbers.

We need zero:

```
type Zero = { prev: never }
```

We also need an equality relation:

```
type Equals<A extends Natural, B extends Natural> =
  A extends B ?
    B extends A ?
      true :
      false :
    false
```

It’s not that useful to have a set of natural numbers that doesn’t do anything. We need a way to transform one into another. A successor function `S` will do the trick:

```
type S<T extends Natural> = { prev: T }
```

Let’s check the first Peano axiom, namely that `Zero` is a natural number:

```
type isZeroANaturalNumber = Zero extends Natural ? true : false
// type isZeroANaturalNumber = true
```

Nice. You can check the rest on your own time.
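To get you started, here’s a quick sketch of two more of the axioms, restating the definitions so far so the snippet stands alone (the type names are just illustrative):

```typescript
// Definitions from above, repeated so this snippet compiles on its own:
type Natural = { prev: Natural }
type Zero = { prev: never }
type S<T extends Natural> = { prev: T }
type Equals<A extends Natural, B extends Natural> =
  A extends B ? B extends A ? true : false : false

// Peano: the successor of a natural number is a natural number.
type isSuccessorNatural = S<Zero> extends Natural ? true : false
// type isSuccessorNatural = true

// Peano: Zero is not the successor of any natural number.
type isOneZero = Equals<S<Zero>, Zero>
// type isOneZero = false
```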

Let’s write out the first few numbers:

```
type One = S<Zero> // type One = { prev: Zero }
type Two = S<One>
type Three = S<Two>
type Four = S<Three>
type Five = S<Four>
type Six = S<Five> // type Six = { prev: S<S<S<S<S<Zero>>>>> }
type Seven = S<Six>
type Eight = S<Seven>
type Nine = S<Eight>
// ... and so on
```

Sweet! We’ve defined the natural numbers. Everything else is just notation.

*Following along? Here is a Typescript playground with what we’ve done so far.*

This seems like a good time to introduce the Collatz conjecture. Unproven until today (and after today, despite my best efforts), it states:

Take any natural number. If it’s even, divide it by two. If it’s odd, multiply it by three and add one. Repeat. Eventually, you’ll end up at one.

Or, more formally:

Define f(n) = n/2 if n is even, and f(n) = 3n + 1 if n is odd. Apply `f` enough times to any natural number, and the Collatz conjecture conjects that you’ll eventually end up at 1.
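For comparison, here’s what f looks like at the value level, a minimal runtime sketch (which, in the spirit of this post, we will of course never actually run):

```typescript
// A value-level sketch of f, strictly for illustration; real programmers
// stay in the type system.
const f = (n: number): number => (n % 2 === 0 ? n / 2 : 3 * n + 1)

// One step at a time: 3 -> 10 -> 5 -> 16 -> 8 -> 4 -> 2 -> 1
```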

To check Collatz using our fancy new `Natural` type, we need to define division by two, multiplication by three, and addition by one. To be safe, let’s just define addition, subtraction, multiplication, and division.

First, let’s define a predecessor function `P`, which is the opposite of our successor function `S`. `P<Zero>` doesn’t make sense, so it is of type `never`.

```
type P<T extends Natural> = T["prev"]
```

Addition is very straightforward; we can define it recursively.

Intuitively,

- Zero is the additive identity
- A + (B + 1) = (A + B) + 1

Defining addition over our `Natural` type:

```
type Add<A extends Natural, B extends Natural> = {
  0: A,
  1: S<Add<A, P<B>>>
}[B extends Zero ? 0 : 1]
```

*It might be clearer to write `type Add<A extends Natural, B extends Natural> = B extends Zero ? A : S<Add<A, P<B>>>`, but we need to use this indexing hack to get around Typescript’s limitations on circular references in types.*
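As a sanity check that addition behaves, 1 + 2 had better be 3. Here is a self-contained sketch (definitions repeated so the snippet compiles on its own; the type names are just illustrative):

```typescript
// Definitions from above, repeated so this snippet compiles on its own:
type Natural = { prev: Natural }
type Zero = { prev: never }
type S<T extends Natural> = { prev: T }
type P<T extends Natural> = T["prev"]
type One = S<Zero>
type Two = S<One>
type Three = S<Two>
type Equals<A extends Natural, B extends Natural> =
  A extends B ? B extends A ? true : false : false
type Add<A extends Natural, B extends Natural> = {
  0: A,
  1: S<Add<A, P<B>>>
}[B extends Zero ? 0 : 1]

// 1 + 2 = 3
type onePlusTwo = Equals<Add<One, Two>, Three>
// type onePlusTwo = true
```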

Subtraction isn’t too hard either.

Intuitively,

- Zero is also the subtractive(?) identity
- Zero minus a positive number is undefined (natural numbers start at zero!)
- (A + 1) - (B + 1) = A - B

```
type Subtract<A extends Natural, B extends Natural> = {
  0: A,
  1: A extends Zero ? never : Subtract<P<A>, P<B>>
}[B extends Zero ? 0 : 1]
```
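A quick check that subtraction respects both the happy path and the “natural numbers start at zero” rule (definitions repeated so the snippet stands alone):

```typescript
// Definitions from above, repeated so this snippet compiles on its own:
type Natural = { prev: Natural }
type Zero = { prev: never }
type S<T extends Natural> = { prev: T }
type P<T extends Natural> = T["prev"]
type One = S<Zero>
type Two = S<One>
type Three = S<Two>
type Equals<A extends Natural, B extends Natural> =
  A extends B ? B extends A ? true : false : false
type Subtract<A extends Natural, B extends Natural> = {
  0: A,
  1: A extends Zero ? never : Subtract<P<A>, P<B>>
}[B extends Zero ? 0 : 1]

// 3 - 1 = 2
type threeMinusOne = Equals<Subtract<Three, One>, Two>
// type threeMinusOne = true

// 0 - 1 is not a natural number
type zeroMinusOne = Subtract<Zero, One>
// type zeroMinusOne = never
```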

Multiplication is almost as simple as addition.

Intuitively,

- Anything times zero is zero
- A(1 + B) = A + AB

```
type Multiply<A extends Natural, B extends Natural> = {
  0: Zero,
  1: Add<Multiply<A, P<B>>, A>
}[B extends Zero ? 0 : 1]
```

*Following along? Playground here*

Division is the trickiest of the bunch, but still not too bad.

Since we’re only dealing with the natural numbers, we can assign `never` to types that represent division of a number by a non-factor.

Intuitively,

- Anything divided by zero is undefined
- Zero divided by a positive number is zero
- Each time we can subtract B from A, we add one to the result

```
type Divide<A extends Natural, B extends Natural> = {
  0: never,
  1: A extends Zero ? Zero : S<Divide<Subtract<P<A>, P<B>>, B>>
}[B extends Zero ? 0 : 1]
```
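A quick check that division behaves, including the division-by-zero case (definitions repeated so the snippet stands alone):

```typescript
// Definitions from above, repeated so this snippet compiles on its own:
type Natural = { prev: Natural }
type Zero = { prev: never }
type S<T extends Natural> = { prev: T }
type P<T extends Natural> = T["prev"]
type One = S<Zero>
type Two = S<One>
type Three = S<Two>
type Six = S<S<S<Three>>>
type Equals<A extends Natural, B extends Natural> =
  A extends B ? B extends A ? true : false : false
type Subtract<A extends Natural, B extends Natural> = {
  0: A,
  1: A extends Zero ? never : Subtract<P<A>, P<B>>
}[B extends Zero ? 0 : 1]
type Divide<A extends Natural, B extends Natural> = {
  0: never,
  1: A extends Zero ? Zero : S<Divide<Subtract<P<A>, P<B>>, B>>
}[B extends Zero ? 0 : 1]

// 6 / 2 = 3
type sixOverTwo = Equals<Divide<Six, Two>, Three>
// type sixOverTwo = true

// Division by zero is undefined
type oneOverZero = Divide<One, Zero>
// type oneOverZero = never
```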

And we’ve defined basic arithmetic!

Since we’re really abusing the type system, if we want to write division like this without any warnings, we need to use a Typescript branch like this one that allows for deeper instantiated types, since we’re generating types that quickly exceed the default depth of 50.

Finally, it’s time we actually check the Collatz conjecture. We need one more utility, to check if a number is even.

We could do this using our `Divide` type, but it’s a bit unwieldy. Instead, we can define even numbers using the following recurrence:

Intuitively,

- Zero is even
- One is not even
- If N + 2 is even, then N is even

```
type Even<T extends Natural> = {
  0: never,
  1: T extends Zero ? true : T extends One ? false : Even<P<P<T>>>
}[T extends Natural ? 1 : 0]
```
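A quick parity check before the main event (definitions repeated so the snippet stands alone):

```typescript
// Definitions from above, repeated so this snippet compiles on its own:
type Natural = { prev: Natural }
type Zero = { prev: never }
type S<T extends Natural> = { prev: T }
type P<T extends Natural> = T["prev"]
type One = S<Zero>
type Two = S<One>
type Three = S<Two>
type Four = S<Three>
type Even<T extends Natural> = {
  0: never,
  1: T extends Zero ? true : T extends One ? false : Even<P<P<T>>>
}[T extends Natural ? 1 : 0]

type isFourEven = Even<Four>
// type isFourEven = true

type isThreeEven = Even<Three>
// type isThreeEven = false
```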

Behold!

```
type Collatz<T extends Natural> = {
  0: true,
  1: Collatz<Even<T> extends true ?
    Divide<T, Two> :       // If even, divide by two
    S<Multiply<T, Three>>> // Otherwise, multiply by 3 and add 1
}[Equals<T, One> extends true ? 0 : 1] // One? true; otherwise, keep going
```

To be fair, we can only check Collatz chains that are very short. Without configuring Typescript’s `maxInstantiationDepth`, we get lots of `Type instantiation is excessively deep and possibly infinite` errors when we check massive numbers like three.

But we don’t bother ourselves with silly details like reality; after all, if we had infinite memory we’d have a perfectly good Turing machine. We could optimize our types to be less greedy about using levels we don’t need, but that’s not really the point here.

```
type c1 = Collatz<One> // true (1)
type c2 = Collatz<Two> // true (2 -> 1)
type c3 = Collatz<Three> // Error: Type instantiation is excessively deep and possibly infinite. (True if we configure the depth to be larger; (3 -> 10 -> 5 -> 16 -> 8 -> 4 -> 2 -> 1))
type c4 = Collatz<Four> // true (4 -> 2 -> 1)
type c8 = Collatz<Eight> // true (8 -> 4 -> 2 -> 1)
```

*Here’s a typescript playground with all of the code in this post. You can use the primitives here to do other cool stuff – how about a primality checker?*

Armed with tools to do arbitrary computation within the Typescript type system – go forth and make your teammates dread reviewing your code! Godspeed.

See my library, ts-json-validator if you want to use the flexibility of the type system for actual good, or this Github issue for some more fun around the Turing-completeness of the type system.