Though a lot has happened in the past 30 years, how do you feel computer programming and engineering have changed as a discipline? Is there still the space and capacity to innovate in programming languages as there was in the 1970s?
There is certainly considerable room for improvement! The taste for it and the taste for inventing the improvements don’t seem to be there (or at least not as strongly as in the 60s). Academia in particular seems to have become very incremental and fad-oriented, and a variety of factors (including non-visionary funding) make it very difficult for a professor and a few students to have big ideas and be able to pursue them. This is a huge problem.
The Xerox Palo Alto Research Center (PARC) seems to have been a bit of a beehive of development and innovation in the 1970s and 80s, and formed the basis of modern computers as we know them today. Have you seen the ICT industry change significantly in terms of a culture of innovation and development?
It is fair to characterize much of what has happened since 1980 as “pretty conservative commercialization of some of the PARC inventions”. Part of the slowdown in new invention can be ascribed to the big negative changes in government funding, which in the 60s especially was able to fund high-risk, high-reward research.
I don’t see anything like PARC today in any country, company or university. There are good people around from young to old, but both the funding and academic organizations are much more incremental and conservative today.
Is there a chance of reviving those innovative institutions of the 60s? Are we too complacent about innovation?
One part of a “revival” could be done by simply adding back a category of funding and process that was used by ARPA-IPTO in the 60s (and other great funders such as the Office of Naval Research). Basically, “fund people, not projects”, “milestones, rather than deadlines”, “visions rather than goals”. The “people not projects” part meant “super top people”, and this limited the number who could be funded (and hence also kept the funding budget relatively low).
The two dozen or so scientists who went to Xerox PARC had their PhDs funded by ARPA in the 60s, and so we were the second generation of the “personal computing and pervasive networks” vision. In today’s dollars these two dozen (plus staff support and equipment, which was more expensive back then) would cost less than $15 million per year. So this would be easy for any large company or government funder to come up with.
There are several reasons why they haven’t done it. I think it is in no small part that today’s funders would much rather feel very much in control of mediocre processes that will produce results (however mediocre) than be out of control with respect to processes that are very high risk and offer no up-front guarantees or promises (except for “best effort”).
The other part of this kind of revival has to do with the longitudinal dimensions. Basically the difference between hunting and gathering, and agriculture. The really hard projects that can’t be solved by “big engineering” require some “growing” of new ideas and of new people. Xerox PARC really benefitted from ARPA having grown us as grad students who had “drunk the Kool-Aid” early, and had deep inner determinations to do the next step (whatever that was) to make personal computing and pervasive networking happen.
A lot of the growth dynamics have to do with processes and products that have rather slight connections with the goals. For example, the US space program was done as a big engineering project and was successful, but failed to invent space travel (and probably set space travel back by 30-50 years). However, Congress and the public would not have stood for spending a decade or more trying to invent (say) atomic-powered engines that could make interplanetary travel much more feasible.
Nobody really cared about interactive computing in the 60s, and the ARPA funding for it was relatively small compared to other parts of the Department of Defense effort against the Russians. So quite a lot got done in many directions, including producing the postdocs who would eventually succeed at the big vision.
Objective-C’s co-creator, Brad Cox, said he saw the future of computer programming in reassembling existing libraries and components, rather than completely fresh coding with each new project. Do you agree?
I think this works better in the physical world, and it really requires more discipline than computerists can muster right now to do it well in software. However, some better version of it is definitely part of the future.
For most things, I advocate using a dynamic language of very high level and doing a prototype from scratch in order to help clarify and debug the design of a new system – this includes extending the language to provide very expressive forms that fit what is being attempted.
We can think of this as “the meaning” of the system. The development tools should allow any needed optimizations of the meaning to be added separately so that the meaning can be used to test the optimizations (some of which will undoubtedly be adapted from libraries).
In other words, getting the design right – particularly so the actual lifecycle of what is being done can be adapted to future needs – is critical, and pasting something up from an existing library can be treacherous.
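The “meaning vs. optimization” approach described above can be sketched in a few lines of a dynamic language. This is an illustrative example, not from the interview: the direct, readable definition serves as the executable “meaning”, and a separately added optimization is tested against it.

```python
def fib_meaning(n):
    """The 'meaning': a direct, readable definition of Fibonacci,
    written for clarity rather than speed."""
    return n if n < 2 else fib_meaning(n - 1) + fib_meaning(n - 2)

def fib_optimized(n):
    """An optimization added separately: an iterative O(n) variant
    that must preserve the meaning's behavior."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

# The meaning is used to test the optimization, as the answer suggests.
for n in range(15):
    assert fib_optimized(n) == fib_meaning(n)
```

The names and the Fibonacci example are hypothetical stand-ins; the point is only the structure: the clear version is the specification, and optimizations (including ones adapted from libraries) are checked against it rather than replacing it.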
The goodness of the module system and how modules are invoked is also critical. For example, can we find the module we need without knowing its name? Do we have something like “semantic typing” so we can find what we “need” – i.e. if the sine function isn’t called “sine” can the system find it for us, etc.?
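One way to picture the “semantic typing” idea above is lookup by behavior rather than by name: describe the function you need with a few input/output examples and let the system search for a match. The registry and helper below are hypothetical, a minimal sketch of the idea.

```python
import math

# A toy module registry in which nothing is helpfully named.
registry = {"mystery_a": math.cos, "mystery_b": math.sin, "mystery_c": abs}

def find_by_behavior(examples, tolerance=1e-9):
    """Return the names of registered functions whose behavior matches
    every (input, expected_output) example within the tolerance."""
    matches = []
    for name, fn in registry.items():
        try:
            if all(abs(fn(x) - y) <= tolerance for x, y in examples):
                matches.append(name)
        except Exception:
            continue  # a candidate that errors on an example is not a match
    return matches

# We "need" sine but don't know what it is called, so we describe it:
# it maps 0 to 0 and pi/2 to 1.
print(find_by_behavior([(0.0, 0.0), (math.pi / 2, 1.0)]))  # → ['mystery_b']
```

A real system would need far richer descriptions than two sample points, but the sketch shows the shift the answer is pointing at: the module system resolves “what we need” instead of “what it happens to be named”.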