The arrival of reasonably priced microcomputers in the late 1970s and early 1980s enabled ordinary folk to actually own, run, and even write their own computer software, albeit in a scaled-down and less leading-edge manner than what computer scientists, university students, and professional programmers were doing with their giant mainframes and super minicomputers.
For many young people at the time, initiation into computer programming meant plugging an Apple II, Atari 800, Radio Shack TRS-80, or Commodore 64 into a TV (used as a monitor) and booting up a very simple operating system and BASIC interpreter. From there they could type in BASIC programs printed in magazines and books, load ready-made software from a floppy disk, or roll up their sleeves and write a BASIC program of their own.
Way back in 1975, one of the first great things that Bill Gates and Paul Allen did was to write a BASIC interpreter for the then-new Altair microcomputer, boosting the machine’s usability immensely.
BASIC, the “Beginner’s All-purpose Symbolic Instruction Code,” was formulated in 1964 by John Kemeny and Thomas Kurtz, who supervised a team of Dartmouth students that implemented the language on the Dartmouth Time Sharing System (DTSS).
Many an old computer booted straight into its BASIC interpreter as soon as you turned it on. You could just sit down, flip a switch, and a cursor would appear in the upper left-hand corner of your TV screen; you were off and running, banging out programs to your proverbial heart’s content.
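For readers who never sat in front of one of those machines, the canonical first program was just a couple of numbered lines typed at that cursor (the line numbers were part of the BASIC language itself, not an editor artifact):

```basic
10 PRINT "HELLO, WORLD!"
20 GOTO 10
```

Typing RUN would then scroll the message down the screen endlessly until you hit the BREAK key — instant gratification, and for many people the hook that led to real programming.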
The art of programming made a giant leap with the appearance of Visual Basic, a software development package that pioneered drag-and-drop design for crafting user interfaces.
In 1987, architect/programmer Alan Cooper came up with an improved, customizable shell program for Microsoft Windows called “Tripod.” Tripod had a palette of tools with which a user could assemble “forms” and place on them — via a drag-and-drop interface — instances of the various tools, such as pushbuttons, labels, directory listboxes, etc.
In March 1988, Cooper showed his Tripod prototype to Bill Gates, who immediately pronounced it “cool” and bought exclusive rights to it. The project was renamed “Ruby,” and the team of Cooper, Mark Merker, Gary Kratkin, Mike Geary, and Frank Raab delivered a new and improved version to Microsoft in early 1990, roughly 18 months later. This new version could add widgets dynamically, and it included a simple language engine.
Instead of shipping Ruby as a powerful new shell for Windows 3.0 as originally intended, Microsoft transformed it into an event-driven visual programming language by swapping out the existing shell language for its own Embedded BASIC (“EB”) engine, which had been designed for the bungled “Omega” desktop database project abandoned in 1990. Now code-named “Thunder,” Ruby eventually debuted as Visual Basic 1.0 at the first Windows World show, held in conjunction with COMDEX in Atlanta, Ga., on May 20, 1991.
Visual Basic seemed miraculous. Inexperienced programmers could now cobble together Windows apps quickly, easily, and visually, building a user interface with drag-and-drop instead of code. Before VB, at least 80 percent of the code in a typical Windows program was devoted to the user interface, with calls into the complex Windows APIs required to generate even simple screen widgets.
Granted, as a sort of “app construction kit,” it didn’t offer such sophisticated object-oriented niceties as polymorphism or operator overloading (ad-hoc polymorphism), but its reduced, streamlined feature set enabled some 3 million amateur and lower-skilled programmers to churn out enormous amounts of software.
Moreover, a little secret among professional programmers is that, once they tried using Visual Basic for prototyping, they too found it to be a great Rapid Application Development (RAD) tool, and they often ended up releasing the VB version of their software as the final “production” version!
By the turn of the century, however, Microsoft decided to revamp and unify the underlying architecture of its product lines. In 2002 it unveiled .NET, an object-oriented development environment and framework that sandwiches an abstraction layer between the operating system and programming language, thus providing language interoperability when writing software — each language can call upon code written in other languages.
When Microsoft announced .NET it also announced Version 1.0 of Visual Basic .NET, which bore little resemblance to its immediate predecessor, Visual Basic 6.0, released in 1998. Indeed, mainstream support for VB 6.0 ended March 31, 2005, and Microsoft’s extended support for it ended March 31, 2008.
Try as it might, Microsoft has been unable to persuade many programmers to dump Visual Basic 6.0 in favor of Visual Basic .NET, or, even better (from Microsoft’s perspective), to adopt its own “enterprise class” language, C#.
A vestige of Visual Basic lives on as Visual Basic for Applications (VBA), which ships with every copy of Microsoft Office. Microsoft’s Silverlight has also made a run as a RAD tool for Web and mobile applications.
But nothing has ever topped good old Visual Basic in terms of usability and popularity.
Strangely, no one has tried to produce a worthy successor to VB, despite a potential market value of over a billion dollars for such a product.
Any takers out there?
Richard Grigonis is an internationally known technology editor and writer. He was executive editor of Technology Management Corporation’s IP Communications Group of magazines from 2006 to 2009. The author of five books on computers and telecom, including the highly influential Computer Telephony Encyclopedia (2000), he was the chief technical editor of Harry Newton's Computer Telephony magazine (later retitled Communications Convergence after its acquisition by Miller Freeman/CMP Media) from its first year of operation in 1994 until 2003.
© 2021 Newsmax. All rights reserved.