
1401 points | alankay | 5 comments

This request originated via recent discussions on HN, and the forming of HARC! at YC Research. I'll be around for most of the day today (through the early evening).
wdanilo ◴[] No.11941656[source]
Hi Alan! I've got some assumptions regarding the upcoming big paradigm shift (and I believe it will happen sooner rather than later):

1. focus on data processing rather than an imperative way of thinking (esp. functional programming)

2. abstraction over parallelism and distributed systems

3. interactive collaboration between developers

4. development accessible to a much broader audience, especially to domain experts, without sacrificing power users

In fact, the startup I'm working at aims in exactly this direction. We have created a purely functional visual<->textual language, Luna ( http://www.luna-lang.org ).

By visual<->textual I mean that you can always switch between the code and graph representations, in either direction.

What do you think about these assumptions?

replies(2): >>11942789 #>>11945722 #
alankay ◴[] No.11945722[source]
What if "data" is a really bad idea?
replies(3): >>11945869 #>>11956981 #>>11984719 #
richhickey ◴[] No.11945869[source]
Data like that sentence? Or all of the other sentences in this chat? I find 'data' hard to consider a bad idea in and of itself, i.e. if data == information, records of things known/uttered at a point in time. Could you talk more about data being a bad idea?
replies(2): >>11946532 #>>11948698 #
alankay ◴[] No.11946532[source]
What is "data" without an interpreter (and when we send "data" somewhere, how can we send it so its meaning is preserved?)
replies(3): >>11946764 #>>11957966 #>>11959640 #
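Alan's point that "data" has no meaning apart from an interpreter can be sketched concretely: the very same bytes yield entirely different readings depending on which interpreter receives them. The byte values below are illustrative.

```python
# The same four bytes, handed to three different interpreters.
import struct

payload = b"\x00\x00\x80\x3f"

as_int = struct.unpack("<i", payload)[0]    # little-endian 32-bit integer
as_float = struct.unpack("<f", payload)[0]  # little-endian IEEE-754 float
as_text = payload.decode("latin-1")         # four Latin-1 characters

print(as_int)    # 1065353216
print(as_float)  # 1.0
```

Nothing in the payload itself says which reading was "meant"; that knowledge travels separately, which is exactly the problem of sending data so that its meaning is preserved.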
richhickey ◴[] No.11946764[source]
Data without an interpreter is certainly subject to (multiple) interpretation :) For instance, the implications of your sentence weren't clear to me, in spite of it being in English (evidently, not indicated otherwise). Some metadata indicated to me that you said it (should I trust that?), and when. But these seem to be questions of quality of representation/conveyance/provenance (agreed, important) rather than critiques of data as an idea. Yes, there is a notion of sufficiency ('42' isn't data).

Data is an old and fundamental idea. Machine interpretation of un- or under-structured data is fueling a ton of utility for society. None of the inputs to our sensory systems are accompanied by explanations of their meaning. Data - something given, seems the raw material of pretty much everything else interesting, and interpreters are secondary, and perhaps essentially, varied.

replies(2): >>11946935 #>>11946989 #
alankay ◴[] No.11946935[source]
There are lots of "old and fundamental" ideas that are not good anymore, if they ever were.

The point here is that you were able to find the interpreter of the sentence and ask a question, but the two were still separated. For important negotiations we don't send telegrams, we send ambassadors.

This is what objects are all about, and it continues to be amazing to me that the real necessities and practical necessities are still not at all understood. Bundling an interpreter for messages doesn't prevent the message from being submitted for other possible interpretations, but there simply has to be a process that can extract signal from noise.

This is particularly germane to your last paragraph. Please think especially hard about what you are taking for granted in your last sentence.

replies(5): >>11947046 #>>11947082 #>>11947341 #>>11947809 #>>11958689 #
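The "telegrams vs. ambassadors" contrast can be sketched as follows. All names here (Telegram, Ambassador) are illustrative, not from any real system; the point is only that the second message carries the behavior needed to interpret itself, while still leaving its raw fields open to other interpretations.

```python
class Telegram:
    """Bare data: the receiver must already know what the fields mean."""
    def __init__(self, fields):
        self.fields = fields


class Ambassador:
    """Data bundled with an interpreter for its own meaning."""
    def __init__(self, fields, interpreter):
        self.fields = fields            # still inspectable from outside
        self._interpreter = interpreter

    def interpret(self):
        # The receiver asks the message to explain itself.
        return self._interpreter(self.fields)


msg = Ambassador(
    {"t": 1718000000, "v": 21.5},
    lambda f: f"temperature reading of {f['v']} degrees C at epoch {f['t']}",
)
print(msg.interpret())
```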
ontouchstart ◴[] No.11947046[source]
I think the object is a very powerful idea for wrapping "local" context. But in a networked (communication) environment, it is still challenging to handle "remote" context with objects. That is why we have APIs and serialization/deserialization overhead.

In the ideal homogeneous world of Smalltalk, it is less of an issue. But if you want a Windows machine to talk to a Unix machine, the remote context becomes an issue.

In principle we could send a Windows VM along with the message from Windows, and a Unix VM (Docker?) with a message from Unix, if that is a solution.

replies(2): >>11947113 #>>11947120 #
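The serialization/deserialization point can be made concrete with a small sketch: crossing a machine boundary through a neutral wire format strips the "local" context. Here a datetime survives as text, but its type (and thus its behavior) is lost and must be re-established by convention on the remote side. The field names are illustrative.

```python
# A value crosses a machine boundary via JSON and loses its local context.
import json
from datetime import datetime, timezone

local = {"event": "login", "at": datetime(2016, 6, 20, tzinfo=timezone.utc)}

# Serialize: the datetime must be flattened to a string to fit the format.
wire = json.dumps({**local, "at": local["at"].isoformat()})

# Deserialize on the "remote" side: the type information stayed behind.
remote = json.loads(wire)

print(type(local["at"]).__name__)   # datetime
print(type(remote["at"]).__name__)  # str
```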
ontouchstart ◴[] No.11947113[source]
Alan, what is your view on the Olive Executable Archive? https://olivearchive.org/
replies(2): >>11953523 #>>11958697 #
alankay ◴[] No.11953523[source]
Their larger goals are important.
replies(1): >>11954872 #
ontouchstart ◴[] No.11954872[source]
Do you think they are on the right path to their larger goals?
replies(1): >>11954982 #
alankay1 ◴[] No.11954982[source]
I think for so many important cases, this is almost the only way to do it. The problems were caused by short-sighted vendors and programmers getting locked into particular computers and OS software.

For contrast, one could look at a much more compact way to do this that -- with more foresight -- was used at Parc, not just for the future, but to deal gracefully with the many kinds of computers we designed and built there.

Elsewhere in this AMA I mentioned an example of this: a resurrected Smalltalk image from 1978 (off a disk pack that Xerox had thrown away) that was quite easy to bring back to life because it was already virtualized "for eternity".

This is another example of "trying to think about scaling" -- in this case temporally -- when building systems ....

The idea was that you could make a universal computer in software that would be smaller than almost any media made in it, so ...

replies(1): >>11955225 #
ontouchstart ◴[] No.11955225[source]
I agree that the "image" idea is more powerful than the "data" idea.

However, since the PC revolution the mainstream seems to have taken the "data" path, for whatever technical or non-technical reasons.

How do you envision the "coming back" of the image path, either by bypassing the data path or by merging with it, in the not-so-faraway future?

replies(1): >>11955374 #
alankay ◴[] No.11955374{3}[source]
Over all of history, there is no accounting for what "the mainstream" decides to believe and do. Many people (wrongly) think that "Darwinian processes" optimize, but any biologist will point out that they only "tend to fit to the environment". So if your environment is weak or uninteresting ...

This also obtains for "thinking" and it took a long time for humans to even imagine thinking processes that could be stronger than cultural ones.

We've only had them for a few hundred years (with a few interesting blips in the past), and they are most definitely not "mainstream".

Good ideas usually take a while to have and to develop -- so when the mainstream has a big enough disaster to make it think about change rather than more epicycles, it will still not allocate enough time for a really good change.

At Parc, the inventions that made it out pretty unscathed were the ones for which there was really no alternative and/or no one was already doing: Ethernet, GUI, parts of the Internet, Laser Printer, etc.

The programming ideas, on the other hand, were -- I'll claim -- quite a bit better, but (a) most people thought they already knew how to program, and (b) Intel and Motorola thought they already knew how to design CPUs, and were not interested in making the 16-bit microcoded processors that would have allowed the much higher-level languages at Parc to run well in the 80s.

replies(1): >>11956539 #
ontouchstart ◴[] No.11956539{4}[source]
It seems that the barriers to entry in hardware innovation are getting higher and higher due to the high risk of industrial-scale efforts. In the meantime, the barriers to entry in software are getting lower and lower due to improvements in tooling in both software and hardware.

On the other hand, due to the exponential growth of software dependencies, "bad ideas" in software development are getting harder and harder to remove, and the social cost of "green field" software innovation is also getting higher and higher.

How do we solve these issues in the coming future?

replies(1): >>11956973 #
alankay ◴[] No.11956973[source]
I don't know.

But e.g. the possibilities for "parametric" parallel computing solutions (via FPGAs and other configurable HW) have not even been scratched (too many people trying to do either nothing or just conventional stuff).

Some of the FPGA modules (like the BEE3) will slip into a blade-server slot, etc.

Similarly, there is nothing to prevent new SW from being done in non-dependent ways (meaning the initial dependencies used to hook up to the current world can be organized to be gradually removable, and the new stuff need not have the same kind of crippling dependencies).

For example, a lot can be done -- especially on a learning curve -- if e.g. a subset of Javascript in a browser (etc.) can really be treated as a "fast enough piece of hardware" (if not a great design) -- one you just "don't touch with human hands". (This is awful in a way, but it's really a question of "really not writing 'machine code'".)

Part of this is to admit to the box, but not accept that the box is inescapable.

replies(1): >>11958076 #
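The "don't touch it with human hands" stance amounts to treating the low-level target purely as output of a compiler from a higher-level description. A minimal sketch, with a hypothetical expression mini-language emitting a JavaScript-flavored target:

```python
# Compile a tiny high-level expression description to target source text,
# so no human ever writes the target "machine code" directly.

def emit(expr):
    """Recursively emit target source for a nested-tuple expression tree."""
    if isinstance(expr, (int, float)):
        return str(expr)
    op, left, right = expr          # e.g. ("+", 2, 3)
    return f"({emit(left)} {op} {emit(right)})"

# High-level description of (2 + 3) * 4
program = ("*", ("+", 2, 3), 4)
target_source = f"console.log({emit(program)});"
print(target_source)  # console.log(((2 + 3) * 4));
```

The generated text is the only thing that touches the target platform; the real programming happens one level up.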
ontouchstart ◴[] No.11958076[source]
Thank you, Alan, for your deep wisdom and crystal-clear vision.

It is the best online conversation I have ever experienced.

It also reminded me of inspiring conversations with Jerome Bruner at his New York City apartment 15 years ago. (I was working on a project with his wife's NYU social psychology group at the time.) As a physics Ph.D. student, I never imagined I could become so interested in the Internet and education in the spirit of Licklider and Doug Engelbart.

Thank you. (謝謝。)

replies(1): >>11958256 #
alankay ◴[] No.11958256{3}[source]
You probably know that our mutual friend and mentor Jerry Bruner died peacefully in his sleep a few weeks ago at the age of 100, and with much of his joie de vivre beautifully still with him. There will never be another Jerry.
replies(1): >>11958342 #
ontouchstart ◴[] No.11958342{4}[source]
Indeed, there will never be another Jerry.

RIP

https://en.wikipedia.org/wiki/Jerome_Bruner

http://www.law.nyu.edu/news/in-memoriam-jerome-bruner