Hacker News

What I Always Wanted to Know about Second Class Values

24 points by todsacerdoti · today at 4:49 PM · 12 comments

Comments

kazinator · today at 6:25 PM

I feel that this work, and the other one it references which proposes modeling first- and second-class values with types, are simply ignorant of the massive amount of research that has been done in this area in decades past.

I'm actually astonished you can still publish anything on this stuff: papers that talk about upward funargs and show trivial Lisp lambda code, like it's 1973.

The stack/heap dichotomy dictating second class vs first class is wrongheaded; we already have results which show that the distinction between stack and heap is artificial and phony.

For instance, Richard Stallman's 1980 work, the "Phantom Stacks" paper. The following two paragraphs are an excerpt:

> The key to reducing the impact of a stack on the complexity of the system is to realize that a stack really is not a sort of object that lives in memory, but a way of allocating memory among other objects. It need not have any existence, except in the "mind" of some part of the system which is allocating memory, for some length of time. If the data which is stored in the stack looks like ordinary Lisp objects, then most of the Lisp system will be able to deal with it properly without having to be aware that there is a stack.

> One consequence of making the stack not recognizable as a stack is that the garbage collector won't know it is a stack. It will compactify, reorder or reallocate it willy nilly. But the stack's contents, valid accessible Lisp objects, will be preserved in meaning by the garbage collection, just like all such Lisp objects. So no harm is done, as long as we realize that the stack isn't a stack any more. The area of memory which used to be a stack is now part of the heap.

Chicken Scheme famously implements an idea from Henry Baker ("Cheney on the MTA") of bump-allocating all objects on the stack. All function calls create new frames, and so do function returns: returns are actually compiled into calls to the continuation, so they too are function calls that push new frames. When the stack reaches a certain limit, it is rolled back; at that moment, all objects on it that are still reachable (including lambda environments) are moved to the heap.

So according to the present paper's framework, these objects transparently switch from the second class to the first class --- and that is incompatible with the idea of treating the two classes as different types.

moring · today at 7:02 PM

Sorry for being ignorant of the basics, but how does the following work?

> However, segments of code that are provably free of garbage collection have deterministic timing and can satisfy hard timing constraints, as they are certainly not interrupted by garbage collection.

You are only ever "certainly not interrupted" if you turn off interrupts, which requires a high level of privileges. And not being interrupted still does not mean you have uncontended access to main memory or shared caches, which is a relevant factor for hard real-time. Nor do you have uncontended access to execution facilities (e.g. multipliers or floating-point ALUs), but at least for those you might be able to find an upper bound even for contended access.

stmw · today at 7:20 PM

Some may argue that the real problem here is the unstated assumption of wanting to have a garbage collector in the first place.
