In the last two weeks, we have lost two people who had an immense influence on the digital era.
It is undeniable that Steve Jobs brought us innovation and iconic products the likes of which the world had never seen, as well as a cult following of consumers and end users who mythologized him. The likes of him will probably never be seen again.
I, too, like many in this industry, paid my respects and acknowledged his influence, despite my documented differences with the man and his company.
But the “magical” products that Apple, Steve Jobs, and many other companies created owe just about everything in modern computing as it exists today to Dennis Ritchie, who passed away this week at the age of 70.
Dennis Ritchie?
Younger readers of this column are probably scratching their heads. Who was Dennis Ritchie?
Dennis Ritchie wasn’t some meticulous billionaire wunderkind from Silicon Valley who mesmerized standing-room-only audiences, in his minimalist black mock turtleneck, with shiny new products and wild rhetoric aimed at his competitors.
No, Dennis Ritchie was a bearded, somewhat disheveled computer scientist who wore cardigan sweaters and had a messy office.
Unlike Jobs, who was a college dropout, Ritchie held a Ph.D. and was a Harvard University graduate, with degrees in Physics and Applied Mathematics.
And instead of the gleaming Silicon Valley, he worked at AT&T Bell Laboratories in New Jersey.
Yes, Jersey. As in “What exit.”
Steve Jobs has frequently been compared to Thomas Edison for the quirkiness of his personality and inventive nature.
I have my issues with that comparison, in that it gives Jobs credit for being an actual technologist, someone who actually invented something.
It is important to realize that while indeed the man was brilliant in his own way, Steve Jobs was not a technologist.
Indeed, he had a very strong sense of style and industrial design, understood what customers wanted, and was a master marketer and salesperson. All of these make him a giant in our industry. But an inventor? No.
Dennis M. Ritchie, on the other hand, invented and co-invented two key software technologies which make up the DNA of effectively every single computer software product we use directly or even indirectly in the modern age.
It sounds like a wild claim, but it really is true.
First, let’s start with the C programming language.
Developed by Ritchie between 1969 and 1973, C is considered to be the first truly modern and portable programming language. In the 40 years or so since its introduction, it has been ported to practically every systems architecture and operating system in existence.
As an imperative, compiled, procedural programming language, C supports lexical variable scope and recursion, allows low-level access to memory, and provides rich facilities for I/O and string manipulation. That versatility allowed Ritchie and Brian Kernighan to refine and document the language, which was eventually standardized as “ANSI C.”
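To make that description concrete, here is a minimal sketch in modern C (an illustration of mine, not code from Ritchie) showing recursion, lexically scoped locals, a raw programmer-managed buffer, and the standard string and I/O routines:

    /* Recursion, lexical scope, low-level buffers, and the standard
       string and I/O facilities, all in a few lines of C. */
    #include <stdio.h>
    #include <string.h>

    /* A recursive function with lexically scoped parameters. */
    static unsigned long factorial(unsigned int n)
    {
        return (n <= 1) ? 1UL : n * factorial(n - 1);
    }

    int main(void)
    {
        char buf[32]; /* a raw, programmer-managed buffer */

        strcpy(buf, "Dennis ");  /* string manipulation via the C library */
        strcat(buf, "Ritchie");

        printf("%s: factorial(10) = %lu\n", buf, factorial(10));
        return 0;
    }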
In 1978, Kernighan and Ritchie published the book “The C Programming Language”. Referred to by many simply as “K&R,” it is considered a computer science masterpiece and a critical reference for explaining the concepts of modern programming, and it is still used as a text when teaching programming to computer science students even today.
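The book opens with what has become the most famous program in computing, “hello, world.” Reproduced here in modern C (the original used pre-ANSI syntax), it is still the first thing many programmers ever type:

    #include <stdio.h>

    int main(void)
    {
        printf("hello, world\n");
        return 0;
    }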
C as a programming language is still used heavily today, and it has since mutated into a number of sister languages.
The most popular, C++ (pronounced “C plus plus”), introduced by Bjarne Stroustrup in 1985, added support for object-oriented programming and classes. It is used on a variety of operating systems, including every major UNIX derivative such as Linux and the Mac, and it has been the primary programming language for Microsoft Windows software development for at least 20 years.
Objective-C, created by Brad Cox and Tom Love in the 1980s at a company called Stepstone, added Smalltalk-style messaging capabilities to the language.
It was largely considered an obscure derivative of C until it was popularized in the NeXTStep and OpenStep operating systems of the late 1980s and early 1990s on Steve Jobs’ NeXT computer systems, built by the company he formed after he was ousted by Apple’s board in 1985.
What happened “next” of course is computing history. NeXT was purchased by Apple in 1996 and Jobs returned to become CEO of the company in 1997.
In 2001, Apple launched Mac OS X, which makes heavy use of Objective-C and object-oriented technologies introduced in NeXTStep/OpenStep.
While C++ is used heavily on the Mac, Objective-C is the language used to program against the native object-oriented “Cocoa” API in the Xcode IDE, and it is central to the gesture-recognition and animation features of iOS, which powers the iPhone and the iPad.
Objective-C also underpins the Foundation Kit and Application Kit frameworks that are essential to building native OS X and iOS applications.
Microsoft has its own derivative of C in C# (pronounced “C Sharp”), which was introduced in 2000 and serves as the foundation for programming within the .NET framework.
C# is also the basis for programming the new Metro applications in the Windows Runtime (WinRT) for the upcoming Windows 8, as well as in Windows Phone 7.x. It is also used within Linux and other UNIX derivatives as the programming environment for Mono, a portable implementation of the .NET framework.
But C’s influence doesn’t end at C-language derivatives. Java, an important enterprise programming language, is heavily based on C syntax, and it remains the primary development language on Android, where programs run on the Dalvik virtual machine.
Other languages, such as Ruby, Perl and PHP, which form the basis for the modern dynamic Web, all use syntax conventions introduced in C, created by Dennis Ritchie.
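How deep does that syntactic debt run? The snippet below (plain C, again my own illustration) shows the brace-delimited blocks, semicolon statement terminators and three-clause for loop that reappear almost unchanged in Java, C#, PHP, Perl and JavaScript:

    /* C idioms inherited nearly verbatim by Java, C#, PHP, Perl
       and JavaScript: braces, semicolons, and the for loop. */
    #include <stdio.h>

    int main(void)
    {
        for (int i = 0; i < 3; i++) {
            if (i % 2 == 0)
                printf("%d is even\n", i);
            else
                printf("%d is odd\n", i);
        }
        return 0;
    }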
So it could be said that without the work of Dennis Ritchie, we would have no modern software… at all.
I could end this article simply with what Ritchie’s development of C means to modern computing and how it impacts everyone. But that would describe only half of the man’s life’s work.
Ritchie is also the co-creator of the UNIX operating system, which, after being prototyped in assembly language, was completely re-written in C in the early 1970s.
Since the very first implementation of “Unics” booted on a DEC PDP-7 back in 1969, it has mutated into many other similar operating systems running on a huge variety of systems architectures.
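The symbiosis between the two ran deep: UNIX was written in C, and C programs talk to UNIX through a small set of system calls. A minimal sketch, using today’s POSIX interfaces rather than Ritchie’s original code, shows the classic model in which everything is a file descriptor:

    /* The classic UNIX I/O model: file descriptors and raw bytes.
       This copies standard input (fd 0) to standard output (fd 1),
       much like a bare-bones cat(1). */
    #include <unistd.h>

    int main(void)
    {
        char buf[256];
        ssize_t n;

        while ((n = read(0, buf, sizeof buf)) > 0)
            write(1, buf, (size_t)n);
        return 0;
    }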
Name a computer vendor, and at some point it has had an implementation of UNIX. Even Microsoft once owned one, a product called XENIX, which it has since sold to SCO.
Essentially, there are three main branches.
One branch is the “System V” UNIXes that we know today primarily as IBM AIX, Oracle Solaris, SCO UnixWare and Hewlett-Packard’s HP-UX. All of these are considered “Big Iron” OSes that drive critical transactional business applications and databases in the largest enterprises in the world, the Fortune 1000.
Without the System V UNIXes, the Fortune 1000 probably wouldn’t get much of anything done. Business would essentially grind to a halt.
They may represent only 10 to 20 percent of any particular enterprise’s computing population, but it is a very important 10 to 20 percent.
The second branch, the BSDs (Berkeley Software Distribution), includes FreeBSD and NetBSD, which form part of the basis for both Mac OS X and iOS, the operating system that powers the iPhone. The BSDs also power much of the critical infrastructure that actually runs the Internet.
The third branch of UNIX is not even a branch at all: GNU/Linux. The Linux kernel (developed by Linus Torvalds), combined with the GNU user-space programs, tools and utilities, provides a complete re-implementation of a “UNIX-like” or “UNIX-compatible” operating system from the ground up.
Linux, of course, has become the most disruptive of all the UNIX operating systems. It scales from the very small to the very large: from embedded microcontrollers to smartphones, tablets, desktops, and even the most powerful supercomputers.
One such Linux supercomputer, IBM’s Watson, even beat Ken Jennings and Brad Rutter on Jeopardy! while the world watched in awe.
Still, it is important to recognize that Linux and GNU contain no UNIX code at all; hence the recursive Free Software acronym “GNU’s Not UNIX.”
But by design, GNU/Linux behaves much like UNIX, and it could be said that had Ritchie and his colleagues Brian Kernighan, Ken Thompson, Douglas McIlroy and Joe Ossanna not developed UNIX at Bell Labs in the first place, there never would have been any Linux or an open source software movement.
Or a Free Software Foundation or a Richard Stallman to be glad Steve Jobs is gone, for that matter.
But enough of religion and ideology. We owe much to Dennis Ritchie, more than we can ever possibly imagine. Without his contributions, it’s likely that none of us would be using personal computers, sophisticated software applications, or even a modern Internet today.
No Android smartphones, no fancy DVRs and streaming devices, and no Macs and iPads for Steve Jobs and Apple to make Amazingly Great.
No “Apps for That.”
To Dennis Ritchie, I thank you — for giving all of us the technology to be the technologists we are today.