## Tuesday, September 20, 2011

### Types of data

A fundamental distinction is the distinction between space and time. This leads us to the distinction between extensional definitions and intensional definitions: extensional definitions enumerate every matching object in space, while intensional definitions describe some temporal process for generating objects.

The field of mathematics is focused on the study of a particular kind of temporal process: the pure function. The only observable effect of a pure function is the creation of some new value.
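As a minimal sketch in Clojure (the function name here is my own example), a pure function computes a new value from its arguments and does nothing else:

```clojure
;; A pure function: given the same arguments it always returns the
;; same value, and its only effect is producing that value.
(defn square [x] (* x x))

(square 5) ; => 25
```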

## Tuesday, September 6, 2011

### Antireduce

Antireduce is a useful function in sequence pattern recognition. Antireduce takes a binary operation (usually the inverse of some other binary operation) and returns the result of applying it across a sequence of values:

```clojure
(defn antireduce
  [op coll]
  (map
    (fn [i]
      (if (= i 0)
        (first coll)
        (op (nth coll i) (nth coll (dec i)))))
    (range (count coll))))
```


For example, we can apply this function to the triangular numbers:

```clojure
(= (take 4 (iterate (partial antireduce -)
                    [0 1 3 6 10 15 21 28 36 45]))
   '((0 1 3 6 10 15 21 28 36 45)
     (0 1 2 3 4 5 6 7 8 9)
     (0 1 1 1 1 1 1 1 1 1)
     (0 1 0 0 0 0 0 0 0 0)))
; => true
```


Alternatively, we can use this on the factorials with the inverse of the next hyperoperation up: division.

```clojure
(= (take 2 (iterate (partial antireduce /)
                    [1 1 2 6 24 120 720 5040 40320 362880]))
   '((1 1 2 6 24 120 720 5040 40320 362880)
     (1 1 2 3 4 5 6 7 8 9)))
; => true
```


The antireduce function itself can be reversed:

```clojure
(defn reducify
  [op coll]
  (map
    (fn [i]
      (reduce op (map (partial nth coll) (range (inc i)))))
    (range (count coll))))
```
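As a quick sanity check (my own example, assuming the `antireduce` and `reducify` definitions above), reducifying with an operation should undo antireducing with its inverse:

```clojure
;; Differencing with - and then re-accumulating with +
;; recovers the original sequence.
(defn antireduce
  [op coll]
  (map
    (fn [i]
      (if (= i 0)
        (first coll)
        (op (nth coll i) (nth coll (dec i)))))
    (range (count coll))))

(defn reducify
  [op coll]
  (map
    (fn [i]
      (reduce op (map (partial nth coll) (range (inc i)))))
    (range (count coll))))

(= (reducify + (antireduce - [0 1 3 6 10]))
   '(0 1 3 6 10))
; => true
```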


This function should allow us to automatically recognise some basic sequences.

## Sunday, September 4, 2011

### PI is wrong

At the core of any Lisper's understanding is the fact that mathematical notation sucks. We recognise just how painful it is to have to parse expressions using a horrid order-of-operations table like PEMDAS. We reject the use of such syntax in program source code, which is written with the luxury of a keyboard, but we are tolerant of such vices when writing with pencil and paper.

As we have a skeptical outlook on mathematical notation from the get-go, you will be hard-pressed to find one of us who isn't also receptive to new mathematical symbols like Tau:

$$\tau = 2\pi = 8 \sum _{n=0}^{\infty} \frac{(-1)^n}{2n+1}$$

I am one of those people who has constantly failed to make sense of the mind-boggling concept of pi radians, so when I read Michael Hartl's "Tau Manifesto" I was a quick convert.

The reason that Tau makes more sense than Pi in most cases is that Tau is based upon the radius, the distance from the center to any point on the circle, rather than the diameter, which is used relatively rarely.

$$\tau = {C \over r}$$
$$\pi = {C \over D}$$
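With tau, a fraction of a turn around the circle is simply the same fraction of tau. For instance, a quarter turn is

$$\frac{\tau}{4} = \frac{\pi}{2}$$

radians, where the pi form obscures which fraction of the circle is involved.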

Tau makes trigonometry much easier, because tau is the period of the sine and cosine functions, and Euler's identity becomes $e^{\tau i} = 1$, which is far more elegant. Even the area of the circle makes more sense with tau, as it becomes tau times the antiderivative of r:

$$A = \frac{\tau r^2}{2}$$

Regardless of these considerations, mathematicians still overuse pi, which is another example of a flaw in our notation systems. However, I would still say that most mathematical theorems are factually correct, despite our flawed attempts at expressing them.

## Friday, September 2, 2011

### On the Lisp machines

The Lisp machines had hardware support for automating memory management, persistence, and many other tasks we could do with automating today. They achieved this with a single address space and a tagged architecture.

However, the real advantage of the Lisp machines was their use of Lisp all the way down. You could trace anything all the way down to the device drivers, and because the system was based upon Lisp, you could effectively automate anything.