'2001 A Space Odyssey' 50 Years On: Q&A with Computer Scientist Stephen Wolfram

In April 1968, Stanley Kubrick and Arthur C. Clarke released "2001: A Space Odyssey." The film has delighted and confounded audiences for 50 years now, and presaged many technological and cultural developments, from computers and AI to space exploration and the search for extraterrestrial intelligence. Computer scientist Stephen Wolfram, inventor of the computational software Mathematica and the online Wolfram Alpha knowledge engine, spoke with Space.com about the impact the movie had on the developing technologies of the day, as well as where its predictions haven't come to fruition.

Space.com: Where does "2001" fit in the canon of predictive science-fiction stories?

Stephen Wolfram: Oh, you know, it's a legend that everyone refers to. Probably more on the computer side than on the space side. By the mid-1960s, everybody was assuming that AI was just around the corner.

Space.com: What gave people this idea, given the state of computers in the late '60s?

Wolfram: Computers were a thing you read about, but not a thing you experienced. If people had been experiencing computers in the way that everyone experiences computers today, they would have had a much more realistic view. But people's knowledge of computers mostly came from science fiction and from the general media feeling about what was going on. And the general feeling was: machines have automated manual labor; machines will automate intellectual labor. It was just taken for granted, just as, at some level, I think it was taken for granted that by the time "2001" came along there'd be space stations and people would be able to go to Jupiter.

People had no idea what it meant to program a computer. For example, if you talked about bugs in computers, in software, even the professionals at that time would have looked at you kind of blankly. Like, "We don't know what you're talking about." In the "2001" movie, HAL says that he is incapable of error, etcetera, etcetera, and that if there's a problem, it's a human problem. And that was the view: computers will be perfect and the humans will be sort of imprecise. The intuition that we have today about computers, that software is buggy and so on, is an idea of probably the late '80s, early '90s. It didn't really exist before then for the general person.

Read More: Space.com
