Just a few weeks after I bemoaned in these pages the lack of a worthy successor to Visual Basic 6.0, along comes May 1, 2014, the 50th anniversary of the Basic computer language itself.
The idea of Basic’s originators, John Kemeny and Thomas Kurtz, was that college students not trained in mathematics could now program computers, thanks to the simple syntax and small command set of the original 1964 version of Basic.
When microprocessors appeared in the 1970s, Bill Gates (with Paul Allen) wrote a Basic interpreter for the Intel 8080-based Altair, and Steve Wozniak wrote one for his 6502 processor-based Apple I and II computers. As a result, a whole generation of future hackers of America grew up typing in listings of Basic code from magazines such as Creative Computing, running primitive computer games, balancing checkbooks, storing recipes, etc.
Even so, the great liberal dream of people writing their own software only partly came to pass.
Back in 1980, when Yours Truly was working for the TV show Sesame Street, computing pioneer Alan Kay came to visit. Kay was just finishing up his Fellowship at Xerox PARC during its glorious years, when Xerox had the entire future of computing in its hands: graphical user interfaces, the mouse, networking, laser printers, you name it.
Of course, Xerox didn’t do much with its home-grown embarrassment of riches, but it did allow the folks at Apple to see it all, which led to the Mac. It also indirectly led to Microsoft Windows.
Among other things, Kay invented overlapping windows and was a Fellow in a number of other companies: Apple, Disney, and HP.
In any case, at the time Kay was engrossed in a computer language called Smalltalk. (I impressed him with my knowledge of trivia that included the fact that Smalltalk was originally developed to run artificial intelligence expert systems.)
Kay recounted an experimental project in which researchers tried to teach people to write their own word processing program using powerful software development tools. Most people couldn’t do it, even when shown how.
At this point we could digress into how the human population appears divided into technically proficient and non-technical types — starting with the British scientist and novelist C. P. Snow’s observation that the intellectual life of the whole of western society was divided into two cultures, the sciences and the humanities — and working our way down to amusing anecdotes about whiz-kid hackers versus technically clueless fine arts and English majors.
Part of the problem is that, indeed, some people’s brains are wired to be analytic and mathematical and others are more generally holistic and conceptual. On the other hand, “algorithmic thinking,” the kind that generates computer programs, is not as closely allied with mathematics as one might think.
Computers are, after all, linguistic machines. You tell them what to do in a language, be it Basic, Fortran, C#, C++, Java, PHP, Python, Ruby, Lisp, or what-not. One can write pages and pages of computer code that has nothing to do with the kind of medium-level mathematics that befuddles college freshmen majoring in the liberal arts/humanities. That’s why I’ve always been amused that university computer science departments are typically associated with math departments.
Moreover, “powerful development tools” are powerful only for programmers. Attempts at producing software development environments for nonprogrammers usually result in highly graphical, “cartoony” tools of limited utility.
This started back in the days of computer telephony, when people wanted to quickly produce interactive voice response (IVR) systems for incoming callers to a business or call center (“press one for customer service, press two for the accounting department,” etc.), or they needed support for specific applications, such as connecting voice response units to mainframe databases (banks, credit card companies), voice messaging systems, audiotext systems, and so forth.
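The call-flow logic those systems automated is, at bottom, simple branching on a caller’s keypress. A minimal sketch in Python — the menu entries and function names here are invented for illustration, not taken from any actual IVR product:

```python
# Sketch of an IVR-style menu: a keypress routes the caller to a
# department. The menu table below is invented for illustration.

MENU = {
    "1": "customer service",
    "2": "accounting",
    "3": "voice messaging",
}

def prompt_text(menu):
    """Build the spoken prompt from the menu table."""
    return ", ".join(f"press {key} for {dept}"
                     for key, dept in sorted(menu.items()))

def route(digit, menu=MENU):
    """Return the department for a keypress, or None to replay the prompt."""
    return menu.get(digit)
```

The point of an applications generator was to let a nonprogrammer fill in a table like `MENU` through a graphical front end, with the branching logic generated automatically.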
Soon there was an explosion of “applications generators,” self-contained environments that nonprogrammers could use to build applications. The resulting code was often less efficient than what a programmer would write from scratch, and the general-purpose “appgens” were inflexible, and thus limited in what they could produce.
The key to the problem, I believe, is mapping a computer language to both the job at hand and the conceptual preferences of the nonprogrammer. Many programmers complain that, on taking over a project, they find their predecessor has written routines that extend the language, essentially creating a private mini-language that must be deciphered. Programmers preach that there should be one “correct” way of doing things in any language.
I am of the opposite opinion, having declared back in the 1980s that every application should have its own “custom language” with a vocabulary that maps closely to the job at hand. Although maddening for “real” programmers, it enables nonprogrammers to understand what’s going on quite clearly, since the computer now literally “speaks their language.”
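To make that claim concrete, here is a hedged sketch of what such a “custom language” might look like: a tiny interpreter whose verbs are the job’s own vocabulary. The recipe-keeping domain and the command names are invented for illustration:

```python
# Sketch of an application "custom language": each verb in the script
# is domain vocabulary ("add", "list", "show"), not general-purpose
# code. The verbs and the recipe domain are invented for illustration.

recipes = {}

def run(script):
    """Interpret a newline-separated script of domain commands."""
    output = []
    for line in script.strip().splitlines():
        verb, _, rest = line.strip().partition(" ")
        if verb == "add":
            # e.g. "add pancakes: flour, eggs, milk"
            name, _, ingredients = rest.partition(":")
            recipes[name.strip()] = [i.strip() for i in ingredients.split(",")]
        elif verb == "list":
            output.extend(sorted(recipes))
        elif verb == "show":
            output.extend(recipes.get(rest.strip(), ["(unknown recipe)"]))
        else:
            output.append(f"(unknown command: {verb})")
    return output
```

A nonprogrammer who keeps recipes can read `add pancakes: flour, eggs, milk` at a glance; a general-purpose language would bury the same intent in syntax foreign to them.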
Richard Grigonis is an internationally known technology editor and writer. He was executive editor of Technology Management Corporation’s IP Communications Group of magazines from 2006 to 2009. The author of five books on computers and telecom, including the highly influential Computer Telephony Encyclopedia (2000), he was the chief technical editor of Harry Newton's Computer Telephony magazine (later retitled Communications Convergence after its acquisition by Miller Freeman/CMP Media) from its first year of operation in 1994 until 2003.
© 2014 Newsmax. All rights reserved.