Hacker News

siruwastaken · today at 12:55 PM

Could somebody provide a bit of context on what exactly this is? It seems interesting, but I have no idea what I am looking at.


Replies

Zhyl · today at 2:07 PM

To complement leethomp's answer: combinatory logic is a branch of mathematics, started in the 1920s by a mathematician called Moses Schönfinkel, which deals with "functions that do stuff and return other functions".

It was developed further by some names that may be more familiar (Haskell Curry, Alan Turing, Kurt Gödel, Bertrand Russell). It was proved equivalent in power to both the lambda calculus and the Turing machine, and became part of the foundation of modern computing.

What we see here are some of those key building blocks that were studied in the 20s and 30s and have now been applied to modern programming languages.

Functional languages use them a lot because you can express a lot of things as just combinations and compositions of other functions. Array languages often take this to an extreme by expressing complex numeric algorithms with only a few symbols.

What you see above is the logic/processing order of how those functions fit together. For example, you can express a mean as a fork like `(+/%#)` in J - sum divided by count, a tiny anonymous function that can be applied to an array - because all the applications and combinations are implicit in the structure of the language, as catalogued in the link.
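That fork pattern can be sketched in plain JavaScript; the helper names here are hypothetical stand-ins for the array-language primitives:

```javascript
// The "fork" pattern (the Φ combinator): applying (f g h) to x means g(f(x), h(x)).
const fork = (f, g, h) => x => g(f(x), h(x));

// Hypothetical helpers standing in for the array-language primitives
const sum = xs => xs.reduce((a, b) => a + b, 0); // +/
const count = xs => xs.length;                   // #
const div = (a, b) => a / b;                     // %

// mean = "sum divided by count" - the argument is never mentioned
const mean = fork(sum, div, count);

mean([1, 2, 3, 4]); // → 2.5
```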

joshmoody24 · today at 6:42 PM

Lambda calculus is a model of computation. A pretty lightweight model that can still do everything that other programming languages can. Combinators are an even simpler model of computation that is still equally powerful. That simplicity / power ratio is what makes them cool.

A while back I built all the way up to FizzBuzz from just S and K combinators. Took days of doing all the math by hand, lol.

Here's my write up of doing that. I did it in JavaScript because most combinator articles online were prohibitively academic for my layman mind. https://joshmoody.org/blog/programming-with-less-than-nothin...

leethomp · today at 1:29 PM

Many primitives in array languages match the behaviour of certain combinators in combinatory logic. The page shows, left to right: the symbol for a combinator; its effective operation in APL syntax, where x and y are the left and right arguments (APL operators are either infix or single-parameter prefix) and F and G are likewise the left and right function arguments; the 'bird', a sort of colloquial name for the combinator; 'TinyAPL', the operator that matches the combinator in the author's APL implementation; and a diagram explaining visually how the combinator works.

BQN, another array language, has a page of documentation describing the same concept for their language, with a bit more explanation for the combinator newcomer: https://mlochbaum.github.io/BQN/tutorial/combinator.html

observationist · today at 2:43 PM

Combinators are math, and a little like Lisp - building functions from primitives and operations with the ability to apply them, where even the notion of a variable is a function - functions all the way down.

The Y combinator is this: λf.(λx.x x)(λx.f(x x))
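That lambda term diverges under JavaScript's strict evaluation, but the eta-expanded Z variant works. A sketch:

```javascript
// Z = λf.(λx.f(λv.x x v))(λx.f(λv.x x v)) - the strict-evaluation-safe
// cousin of the Y combinator above
const Z = f => (x => f(v => x(x)(v)))(x => f(v => x(x)(v)));

// Recursion without self-reference: the function receives itself as `rec`
const fact = Z(rec => n => (n <= 1 ? 1 : n * rec(n - 1)));

fact(5); // → 120
```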

Lambda diagrams get you visualizations like this:

https://tromp.github.io/cl/diagrams.html

When considering logic and functions in the space of combinators, you can ask questions like "what is Plus times Plus?" and get a sensible result. https://www.youtube.com/watch?v=RcVA8Nj6HEo

Combinators are awesome.

The site linked by OP is a specific collection of combinators with bird names, riffing on the "To Mock a Mockingbird" puzzle book and subsequent meme of giving combinators bird names.

laszlokorte · today at 3:55 PM

Based on other existing material on the topic (like the excellent code_report YouTube channel), I once wrote an introduction to combinators and lambda calculus targeted at JavaScript developers (mostly targeted at my younger self) [1].

In short, a combinator is a pure function that accesses only identifiers that are provided as arguments.

length(x, y) { sqrt(x*x + y*y) } is not a combinator because it relies on global definitions of plus, times, and sqrt.

But foo(x, y, b, u, v) { v(b(u(x), u(y))) } is a combinator because it only composes functions that are given as arguments.

foo(3, 5, +, square, sqrt) would result in the same value as length(3, 5), so foo can be regarded as capturing the compositional structure of the Euclidean distance calculation.
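As a runnable JavaScript sketch of the same idea (the helper names plus and square are illustrative):

```javascript
// The combinator: it only composes functions passed in as arguments
const foo = (x, y, b, u, v) => v(b(u(x), u(y)));

// Hypothetical helpers supplying the "global" meanings
const plus = (a, b) => a + b;
const square = n => n * n;

// sqrt(square(3) + square(5)) = sqrt(34), the Euclidean length of (3, 5)
foo(3, 5, plus, square, Math.sqrt);
```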

[1]: https://static.laszlokorte.de/combinators/

jb1991 · today at 2:55 PM

This site is actually named after one of the most popular and widely used combinators in Lisp.

momentoftop · today at 5:25 PM

Combinators were an attempt to do logic (and computation falls out of it) without having to mess around with variables and variable substitution, which is annoying and inelegant because you have to worry about syntactic issues like variable capture. The result is a much simpler and cleaner syntax.

So combinatory logic starts with a really simple language, based on a small alphabet of primitive combinators. You can see a bunch listed on the webpage:

   I, K, W, C, B, Q, ....
These are the primitive bits of syntax. The only other feature in the language is the ability to apply one combinator to another combinator. You write an application of a combinator "x" to another combinator "y" as "x y", and for convenience, applications are left-associative, so "x y z" means "(x y) z": that is, first apply x to y, and then apply the resulting combinator to z.

Two typical combinators are K and S, with which you can form more complex combinators like

   K K
   S K
   K K K
   K (K K)
   K (S K)
...

Combinators generally come with simplification rules, and the ones for K and S are:

   K x y = x
   S f g x = f x (g x)
With these, we can start doing interesting reductions like:

   S K K x = K x (K x) = x
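Those two rules and the S K K reduction can be checked directly in JavaScript with curried functions:

```javascript
// Curried K and S as plain JavaScript functions
const K = x => y => x;               // K x y = x
const S = f => g => x => f(x)(g(x)); // S f g x = f x (g x)

// S K K reduces to the identity: S K K x = K x (K x) = x
const I = S(K)(K);

I(42); // → 42
```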
Now the weird fact: we're suddenly Turing Complete. It turns out that every possible computation is expressible just by building a big combinator out of K and S and applying those two simplification rules. No other machinery is needed.

K and S are not the only combinators with this property; other sets of combinators also form an adequate Turing-complete basis.

If you've heard of the Curry-Howard correspondence (Curry was responsible for combinatory logic), then combinators provide probably the simplest example of it, since if you give combinators types, you realise you are working with what's called a "Hilbert style" deduction system for propositional logic, which is the simplest sort of formal logical system. Indeed:

   1. Hilbert's first two axioms for his version of the calculus, p → (q → p) and (p → (q → r)) → ((p → q) → (p → r)), are exactly the types of K and S
   2. K and S are invocations of these axioms
   3. Application is modus ponens
   4. The combinator S K K above corresponds to the proof that p → p.
   5. The simplification of S K K x is proof normalisation (if you ever see the proof S K K x for some proof x, you should simplify it to just the proof x).