Thursday, January 27, 2011

Haynie's Hypothesis Part 01: The Personal(ized) Computer


At Connectify, we're lucky to work with some of the smartest, quirkiest, and coolest people around. The one staff member who truly embodies all of those qualities is our lead engineer, Dave Haynie. Besides having developed some of the most iconic personal computers of the '80s and '90s, Dave is a seasoned musician, documentary filmmaker, and the kind of guy who'll give you a Wikipedia-like response to the most esoteric question. Anyway, Dave spends a lot of the day staring at complicated schematics and tiny circuit boards, but we're hoping to pull him away twice a month to write about whatever he wants, in a guest column we're calling Haynie's Hypothesis.

Hello, Connectify blog readers! My name is Dave Haynie, and I've been an engineer on the team here since the beginning. It's also worth noting that I've been around the personal computer industry, in one way or another, since its inception. When I was a kid, I used to work on crazy projects all the time. On one occasion, I built a bomb that blew a hole in our street (back when home bomb-making was a bit more socially acceptable). I also constructed a variety of vehicles, including rafts and rocket cars. Mostly, however, I was fascinated by electronics. I built a radio for a science project in fourth grade, a laser for the junior high science fair, and at least one digital clock, among other things, every year of high school. I can also recall a whole series of attempts to make a working electronic drum, and another run of pretty successful shocking devices (each one delivering hundreds or thousands of volts into friends, my poor sisters, etc.).

But along the way, I discovered computing, about as early as a kid could discover computing in those days. When I was twelve years old, my dad was able to get me access to a scientific mainframe computer at Bell Laboratories on the weekends. This was a classic ‘time-share’ system: the computer was who-knows-where (somewhere in the old Bell Labs building in Holmdel, NJ, that once housed 10,000 employees... they're turning that into a shopping center, I hear), and you accessed it via a printing terminal. I started with games, and got bored pretty quickly. So then I took them apart, and learned some BASIC and FORTRAN in the process.

Always at the mercy of the phone connection, the supply of thermal paper, and my dad not getting too busy to remember the terminal on Friday, this arrangement was far from ideal. A few years into it, however, they reorganized the phone system (Bell Labs had its own exchange, but it wasn't enough, so they were moving dial-ups to internal switches). I found other computers using what later came to be called ‘war dialing’: simply calling every 201-949-xxxx number and noting which ones answered with a computer. That's ultimately how I found my way into a UNIX system, the original inspiration for what today is GNU/Linux, and the OS that most of the Internet runs on. Of course, I didn't know that at the time, but then again, Bell Labs hadn't quite thought much about passwords at that point, either.
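
For the curious, the whole technique fits in a few lines. Here's a minimal sketch of the idea in modern Python (obviously not what I ran in the 1970s; the dial() function here is a hypothetical stand-in for actually placing a call and listening for a carrier):

    def dial(number):
        """Hypothetical stand-in: place a call, return True if a computer answers."""
        return False  # a real war dialer would drive an actual modem, one slow call at a time

    def war_dial(prefix="201-949"):
        hits = []
        for line in range(10000):                 # every xxxx in the exchange, 0000 through 9999
            number = "%s-%04d" % (prefix, line)
            if dial(number):                      # did a computer pick up?
                hits.append(number)
        return hits

    print(war_dial())

That's the whole trick: exhaustive enumeration of a four-digit number space. Patience did the rest.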

But then everything changed with the first wave of personal computing. The very first personal computer I used was a friend's Commodore PET 2001. The PET was designed by Chuck Peddle, the inventor of the 6502 microprocessor. Back in those days, you knew things like that, even as a personal computer user. The PET was a complete computer in one box: keyboard, screen, and storage (on cassette tape), which you could own for $799.99. Most people bought them to learn to "use a computer," which probably meant programming in the BASIC language. BASIC, because Mr. Peddle had used Dartmouth BASIC on a mainframe somewhere, and managed to buy a perpetual license for Microsoft BASIC on the 6502 processor for about $10,000... from a very, very early version of Microsoft. You could buy application software, on tapes, but it was simple stuff by today's standards. The reason this was personal is pretty clear: a person or household could actually own this computer, and use it as much as they liked.

Things changed again in the 1980s. The Commodore 64 and Atari 800 revolutionized home computing, making the computer "personal" in other ways. These computers became so affordable that the average family could have several in the house. They also grew to do more of the kinds of things a person might want to do on a computer. It was also the first time that people buying software outnumbered people writing their own. But still, you probably bought a computer because you were "into computers." The computer itself was the hobby. This is also why, to this very day, there are still people using at least a few of the more than 17 million Commodore 64s made over the machine's product life. (I wound up working at Commodore for 11.5 years, developing the Commodore 128 and serving as a chief hardware engineer on various Amiga products.)

This sort of computing continued to grow with the introduction of the IBM PC. "PC" of course stands for "Personal Computer." The IBM wasn't an amazing value, but early on, with a built-in hard drive, it allowed users to personalize the set of applications they'd always have with that computer. Similar innovations came from Apple's Macintosh, which added graphics, and Commodore's Amiga, which introduced more photo-realistic visuals, graphics accelerators, and high-quality stereo audio. In short, the computer was evolving to accommodate us humans. This personalization increasingly opened the world of computing to people who had never thought of the computer itself as a hobby, but could use it to accelerate the things they did like to do. First music production, then photography, artwork, and video: just about every discipline of art and science has been revolutionized by the personalization of the computer.

The big contribution from the IBM PC was the PC Clone. This really opened up the world of the personalized computer. There are PC compatibles in every shape and size, from hundreds of hardware vendors. You can buy a PC compatible for $200, or spend well over $10,000. They can run Windows, which you can also personalize to your needs. If you're a computer expert, you can run GNU/Linux on the same hardware, and therefore personalize the very operating system you're running. There are even clones of other computing platforms that run on the PC, including AmigaOS (AROS) and BeOS (HaikuOS). 

After Commodore, I worked at several startup companies, each on some next-big-thing or another. One basic idea, which is starting to happen now, is that everything digital is really a computer... so why not just use it that way? In 2000, I had a set-top box that was a DVD player, an Internet media/IPTV player, and a DVB (digital television) player, and it was, pretty much, also a personal computing device. You could surf the net, read your email, and we had planned to allow small user applications -- the basic idea of "apps" as found on iOS or Android today. A few years later, at another company, I was working on a way to connect all of your media devices through a home network. You'd only have to care about the thing you wanted to see: a tailor-made viewing experience could automatically follow you to any other networked viewing device in the house. And a big part of this was making it personal: the system adapted to your needs, and unlike more "advanced television" ideas, it actually made things easier to use. We wanted something our moms and dads would use, not just the kids.

Today we're all witness to the next big jump in personal computing, which is just now really happening: mobile computing. This is moving the idea of personal computing from the PC to a wider variety of devices, just as some of my past companies tried to do. However, it's also making computing an even more permanent part of our lives. The PC can, of course, be schlepped around in laptop form, with a battery good for a few hours. But it's not ever-present… you change your life around just to use the laptop -- open it, set it down somewhere, charge it frequently. Mobile computing is changing that.

These new devices go with you. The move to advanced tablets and smartphones is truly another milestone in the personal computing story. We call them "smartphones," but they really are networked personal computing devices. The typical smartphone has a CPU hundreds of times more powerful than that of a fifteen-year-old desktop PC, and it will surely boast more memory, more storage, a faster GPU, and so on. Of course, the "phone" part reflects communications, so I have my address book, Facebook friends, several email accounts, and a whole slew of personal data on this phone. I take notes on it: no pen and paper required. The same device is also a camera, camcorder, GPS unit, city guide, shopping assistant, guitar tuner, games console, the usual iPod/PMP stuff (movies and music), etc. It's all very personalized to what I do; even the look and feel of the user interface reflects my personality.

The physical embodiment of the "personal" factor is that, in the words of that great philosopher Daffy Duck, "it's mine, mine, all mine." It's not uncommon to share a PC, even today, but it's just plain weird to use someone else's cell phone. And because it's always at hand, to many users, mobile devices have rapidly become essential, pervasive tools… maybe even an extension of their thinking process. The soft aspect of this is that it's mine because it's personalized, and it's more personal because it just works. You don't usually have to think about walking or speaking... it just happens. The best of personal computing works similarly: it just happens.

One of the main factors enabling that behavior, particularly on a pocket device, is the network. First of all, the network connects me to the rest of the world, in ways people didn't really expect twenty years ago. New ways to communicate are being thought up all the time. But the network also serves to make devices useful, since they can transfer information, sharing mine and allowing me to scan through terabytes of data for just the thing I need. I can "Google" some question right when it occurs, no need to walk to the PC. If I have an idea for a song or an article, I just write it down, and if I do sit down at my PC, it's right there waiting... never anything for me to do. If I'm wandering in an unknown city, I can find a good place to eat, or even view an annotated world through my smartphone's screen. And if my smartphone were lost or destroyed, the replacement would automatically personalize itself in short order, again, through the network.

The network is also how we'll increasingly connect things together. You may use Connectify to link laptops together today, but as things get smaller, more personal, and smarter, this is going to be everywhere. Today, I have to hook wires to my cameras or camcorders to transfer video. Eventually, however, I'll set up a policy in the device, and it'll just unload itself to my PC or smartphone. My tablet will be where I read the news, magazines, and books, and these will show up as network connections become available. It's already true that things I change on my PC just show up changed on my smartphone, and vice-versa. I buy a song from Amazon on the Droid, and it'll be on my PC, nothing for me to do. There are criticisms that "The Cloud" is trying to bring us all the way back to the idea of computing that someone else controls, setting the policies we live by. And while that can be true in isolated instances, I don't see the genie of personal computing going back into the bottle any time soon. Properly used, the network really is another fundamental step in the personalization of the computer.