
The Incredible Disappearing Hardware

In my first post I alluded to a number of issues but didn't go into detail on any of them. Over the next few posts, I'll expand on some of those issues. Today I'd like to look into my assertion that hardware is turning into software and explore the hardware/software dichotomy that exists in essentially every product we buy that has an electric pulse.

I'll start by making an equally surprising claim: hardware is becoming invisible. I know I'm not the first to say this. I have the distinct memory of hearing someone say it and thinking, "that's exactly what's happening," but unfortunately I can't remember where or when I heard it, or who said it. As it stands, I've been noticing that hardware is disappearing for a few years now, and every major new product announcement has only further convinced me that it's true.

So what does that even mean, hardware is becoming invisible? Of course it's not, you say. There's more electronics hardware than there's ever been. But in a physical sense, hardware has been shrinking in size. It's obvious that our phones, computers, and televisions are getting smaller, lighter, and thinner. Look at all the svelte new tablets and super-slim LED TVs compared to their predecessors from 5 years ago, or even 2 years ago. That trend has been going on for decades, though. Remember the bricks they passed off as cell phones in the '80s?

That's not what I mean, though. How about completely new types of products? A new product can't come out in a clunky, bulky initial version, or it will be summarily passed over. It would have to be something incredibly novel, or it would be a joke, and rightly so. Beyond that, you would have to think of something that a smart phone or tablet can't already do with an app, and if there isn't an app, you can bet there will be soon. I have an app that can set my car's climate control to the temperature I want. No remote starter necessary. The car comes with that capability built in. I just had to download the app on my iPod, and it talks to the car through the internets! That's just the tip of the iceberg for the iPod. There's a practically infinite number of things it can be, including a phone if I Skype over a WiFi network. Oh, and it's a portable music library, too. I almost forgot about that. How many CDs does it take to hold 10,000 songs? How much hardware has this one little device made obsolete or irrelevant?
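Just for fun, here's the back-of-the-envelope math on that CD question. The per-song size and disc capacities are rough assumptions on my part, not measurements:

    # Rough math: how many CDs does it take to hold 10,000 songs?
    songs = 10_000
    mb_per_song = 4            # assumed average size of a compressed song, in MB
    mb_per_data_cd = 700       # assumed capacity of a CD-R used as a data disc
    songs_per_audio_cd = 15    # assumed track count for a typical audio CD

    total_mb = songs * mb_per_song                 # about 40,000 MB of music
    data_cds = -(-total_mb // mb_per_data_cd)      # ceiling division: ~58 discs
    audio_cds = -(-songs // songs_per_audio_cd)    # ~667 discs

    print(f"As MP3s burned to data CDs: about {data_cds} discs")
    print(f"As regular audio CDs: about {audio_cds} discs")

Either way, that's a stack of plastic replaced by something that fits in a pocket.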

Okay, okay, let's push a little harder on this iPod idea, and let's throw in every other display device while we're at it, because this is in no way Apple-specific. That means computers, laptops, ultrabooks, smart phones, tablets, and smart TVs (or any TV connected to a PS3/Xbox 360/Blu-ray player) are all part of this discussion. They are all becoming invisible. They are all turning into software. They are all just a display attached to the internet. Nobody cares what processor is in them as long as they have a clear, vibrant display that can show people what they want to see. The device doesn't even need to do what the user wants directly because it can all be done in the cloud... with infinite upgrades to the software.

Increasingly, what you want to see and do with software can be done anywhere. You can be hardware, and even operating system, agnostic. All you need is an internet connection, and you can get at your data and media from any device running any OS you please. You can even pick your cloud or use a variety. There's Dropbox, Google Drive, iCloud, and SkyDrive just off the top of my head. They give you access to, backup of, and recovery of your data anywhere, all the time. Who needs physical backup media anymore? That hardware is disappearing.

I'm actually finding myself using my desktop computer less and less. The only thing I do on it anymore is edit photos and video because it's a focused activity where I don't mind sitting in front of a computer terminal for a while. Anything else I'd rather do on a different display in a different setting. There's no single device that's replacing the desktop computer, though. I play games and watch movies on my TV and PS3, I listen to music and do all kinds of little tasks on my iPod, I read books on my Kindle, I code (and now blog) on my laptop, and I want a Kindle Fire for general internet reading because I'm tired of using my laptop for that. The hardware doesn't matter; it's all software with a display that allows me to interact with it.

Here's another take on how hardware is becoming invisible. Remember that whole Intel-AMD battle being fought over who could build the fastest, most powerful processor? Who won? How about Google. Or maybe Facebook. No, really, does anyone even still care? Any of a number of processors could be in your devices, but in all likelihood, it doesn't matter to you. Somewhere during the race to maintain Moore's law, software decided, "This is good enough, thanks." Even for the more demanding applications, a 3 GHz dual- or quad-core processor and 4GB of memory seem to be plenty.

The processor performance race and the memory upgrade track have both kind of fizzled. Intel came out with a 3.8 GHz Pentium 4 in 2004. The architecture sucked, but let's ignore that. What's the fastest Intel processor you can get today? A 3.6 GHz Core i7-3820, and it's 9 years later. That should be 6 doublings, or 64 times the transistor count since the Pentium 4. I know the frequency couldn't keep increasing at that rate, but the frequency didn't increase at all. If it had, the Core i7 would probably be hotter than the surface of the sun. Where did all of those transistors go?! Not into extra cores because it's only a quad-core processor - that's four cores, not sixty-four. They mostly went into on-chip cache, integrated on-chip graphics, and extra features for software like hardware virtualization (think about that for a second). You might make the argument that most software can't take advantage of 64 cores, and that's probably right, at the moment. But I'm saying that software doesn't even need to take advantage of more cores. Most software doesn't need faster hardware, so hardware is becoming irrelevant because anything available is good enough for all practical purposes.
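For anyone who wants to check my math, that "64 times" figure is just the usual 18-month reading of Moore's law applied over those 9 years:

    # Back-of-the-envelope Moore's law arithmetic: doubling every 18 months.
    years = 9                            # 2004 Pentium 4 to today
    doublings = years * 12 / 18          # 108 months / 18 = 6 doublings
    scaling = 2 ** doublings             # 2^6 = 64x

    print(f"{doublings:.0f} doublings -> {scaling:.0f}x the transistors")

    # And if clock speed had somehow kept scaling the same way (it clearly didn't):
    print(f"3.8 GHz x {scaling:.0f} = {3.8 * scaling:.0f} GHz")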

Memory is on pretty much the same stalled path. In 2008 a low-end system had about 4GB of RAM, and you could load up a high-end system with 16GB of RAM. I just looked at Newegg.com, and memory in pre-configured desktops and laptops ranges from, wait for it... 1GB to 16GB. Hmm, just out of curiosity, I checked my order history, and in 2007 I bought 2x2GB of DDR2-800 SDRAM for $100. In 2010 I bought 2x2GB of DDR3-1600 SDRAM for $110. That's the same amount of memory, albeit faster, for slightly more money. Today you can get exactly the same memory for $30, and it's still considered a decent amount of memory. However, if memory had actually continued doubling every 18 months, I should be able to get 8 times 16GB, or 128GB of memory today. Well, I could get 64GB, but it would have to be in the form of 8 DIMMs, and not many motherboards support that. Besides, what counts is the amount of memory jammed onto a single DIMM, and that certainly hasn't doubled three times since 2008.
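That 128GB number comes from the same doubling arithmetic; the 2008 baseline and three doublings are straight from the timeline above:

    # Apply the same 18-month doubling to memory, starting from a 16GB
    # high-end system in 2008. Three doublings is roughly 4.5 years later.
    baseline_gb = 16
    doublings = 3
    projected_gb = baseline_gb * 2 ** doublings    # 16 * 8 = 128GB

    print(f"Projected high-end memory today: {projected_gb}GB")
    print("Actual pre-configured systems: still topping out around 16GB")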

I would say memory hasn't continued to grow exponentially because it no longer needs to. There are very few consumer applications that use more than a gigabyte, and the people who do use those memory-hungry applications aren't going to see significant performance improvements beyond 16GB. Software has stopped expanding in size to fill any memory available. Instead, software is expanding in kind to fill any desire available. The hardware is no longer the driver for solving new and interesting problems. Now the driver is the software, and if you can come up with better software to solve people's problems more easily or elegantly, you don't have to wait for the hardware to catch up. The hardware is already there, it's already capable, and it's already virtualized and abstracted so you can ignore most, if not all, of the hardware details.

Let's look at another area where software is replacing hardware. Engineers have made tremendous advances in DC/DC converters lately, and those advances are enabling incredible efficiency gains in DC motors and LED lighting. Inside these new converter chips is a software control loop that reacts intelligently to the available input power and desired output power to extract maximum efficiency from the power source, usually a battery. The software is making these advances possible, and they are coming at a faster rate every year as people discover novel improvements to the software algorithms that control the converters.
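To give a flavor of what that kind of control loop looks like, here's a minimal sketch of a "perturb and observe" style algorithm, one common way firmware can keep pulling maximum power from a source. The toy source model and all the names here are my own illustrative assumptions, not any particular chip's actual firmware:

    # A minimal "perturb and observe" loop: nudge the converter's PWM duty
    # cycle, watch whether input power went up or down, and reverse course
    # when it goes down. source_power() is a stand-in for reading voltage
    # and current from ADCs on real hardware.

    def source_power(duty):
        # Toy model: delivered power peaks at some "best" duty cycle (0.62 here).
        return max(0.0, 10.0 - 80.0 * (duty - 0.62) ** 2)

    def run_loop(iterations=50, step=0.01):
        duty, direction, last_power = 0.40, +1, 0.0
        for _ in range(iterations):
            power = source_power(duty)      # real firmware: V_in * I_in from ADCs
            if power < last_power:
                direction = -direction      # that nudge made things worse; reverse
            duty = min(0.95, max(0.05, duty + direction * step))
            last_power = power
        return duty, last_power

    duty, power = run_loop()
    print(f"Settled near duty cycle {duty:.2f}, extracting about {power:.1f} W")

Real converters do something far more sophisticated, and in a hard real-time loop, but the principle is the same: the smarts that squeeze out the efficiency live in software, not in the silicon.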

Now imagine what happens when all of those advances in DC motors and LED lighting get combined in a car. Well, you can take out the engine, the fuel line, the gas tank, the exhaust system, the engine coolant system, and the oil. Put in a big DC motor and a ginormous battery, and you've got a Nissan LEAF... or a Tesla Roadster (sweet, but expensive). More electric car models are coming this year, and they are going to keep on coming because they are the future. These cars wouldn't even be possible if it weren't for all the software that's packed in them to control the efficiency of the motor and lights, and especially to manage the charging, discharging, and regenerative braking for the battery.

I have a LEAF, and I am constantly impressed with the software in it. Remember that it can be climate-controlled from my iPod? I can also check the charging status over the internet, download new charging station locations that are mapped in the GPS navigation system, and get updates on all kinds of environmental information. But the really thought-provoking combination is that the car is internet-connected and the battery/motor performance is software-controlled. Think about that for a moment. I'm not sure it will happen, but if Nissan released an update, I could download a performance upgrade for my car. How's that for virtualizing hardware and making it more flexible through software? Hey, I can dream, can't I? It's now theoretically possible.

There are many more examples of how hardware is turning into software, but that's plenty to show where the idea came from and where it's headed. The implications for the future are truly mind-boggling. What hardware do you think will turn into software next?
